When I consider the mass adoption of AI, and the ever-expanding applications of automation, machine learning, robotics, and chatbots, I don't particularly worry about the estimated 90 million jobs AI will supposedly displace globally over the next decade. Efficiency gains and technology leaps, from the first Industrial Revolution to Industry 5.0, from the introduction of machinery to the Ford automated manufacturing line, have shown that industry, and people, are resilient. New jobs are created to replace the ones that are lost. New domains of expertise and technical skills are required for those that have become obsolete. For industries that can be flexible and adapt, fast-paced technological change may force a little soul-searching, but it is not the death knell of their business.
No, what I worry about is not job losses. Nor is it potentially sentient computers. Nor is it computers getting smarter than humans, because in many ways, they already are.
I worry that artificial intelligence will eat away at the brains of future generations.
With automation fast replacing human cognitive functions in the workplace, we are creating the perfect breeding ground for an unprecedented skills gap in the years to come. The rise of expert automation and augmentation software (EAAS) in industries such as journalism, law, consulting, and healthcare is eroding the most precious thing a human has, what has always differentiated us from other species and, now, from machines: our mind and imagination.
CB Insights' most recent report on the State of Automation was, for me, an electric jolt to the system:
"A growing wave of AI-infused Expert Automation & Augmentation Software platforms will usher in a new era of AI-assisted or AI-enhanced productivity."
Yes, indeed it will. It will eliminate countless repetitive tasks among white-collar workers, such as reading through thousands of pages of litigation documentation, writing an article, creating a marketing campaign, or even coding software. It's more efficient, it's more productive, and in theory employees can be better utilized for other subjective skills and competences that cannot, at present, be handled by AI.
But as AI gets smarter, and it will, this technology will undoubtedly replace the very cognitive and subjective skills that we, as employees, people, humans, use on a daily basis. Our brains.
I have already started to notice a distinct cognitive skills gap among millennials. The bottom of the curve started with the rise of social networking and social media within a workforce primarily composed of digital natives. It created new types of skill sets and job functions that never existed previously. But most critically, it has created a new attitude within the workplace. Convenience, speed, social connection, experience: the very things technology has brought us in this generation have changed millennials' attitude to work and tasks. Rather than feeling rewarded by one's ability to think, perform, and learn, it is a competition over who can adopt the latest automation software most efficiently and rapidly to make sure the technology does your job. This is great for business, but concerning for the continued growth of civilization.
But the most at risk are Generation Z. By the time they hit the workplace, some jobs that previously demanded human expertise and cognitive functions will no longer exist. Even today, machine learning software and predictive analytics are more accurate than financial traders and investment managers. Google is already testing self-writing code, which could one day make software engineers unnecessary. Startups in the media industry have already shown that a computer can not only write a better article than a journalist, but also incorporate the targeted data searches, personalization, and interest attributes that would be nigh-on impossible for a human to match.
Generation Z may grow up in a society where teachers, educators, and scientists spend more time training a machine's cerebral functions than training the bright minds of tomorrow. The 'fake news' scandal already shows that a younger generation is finding it difficult to differentiate between real and fake news, and this has nothing to do with the source of information or intelligence. It is because many young minds are no longer schooled to think critically, read between the lines, and analyse multiple sources to form a balanced opinion. No: if it's on Google, from X or Y source, it's obviously real.
This equates not to a loss of jobs, but a loss of brainpower. I fear the next generation will lose their critical-thinking abilities, which in turn, over years, maybe decades, will lead to the eradication of our ability not only to perform certain key skills, but to think for ourselves.
For me, this is the end of civilization: when the machine no longer complements human advancement but overtakes it, and renders it useless, because we chose convenience.
To this end, we need to maintain a fine balance between technological leapfrogging and skills replacement. We need to adapt education not only to the necessities of digitalization and technology, but also to ensure that technology enables the cognitive and subjective growth of human functions. We need to value and reward our thinking abilities and our imagination, and instill this in future generations. Because, after all, this is what enabled us to create this technology in the first place.