The future of jobs and education

A pilot distribution of Chromebooks in a school.

Credit: Flickr

Today's education system is preparing children for jobs that won't exist in the future. It's time to rethink our children's future.

Life is a learning experience, or so they say.

Broadly speaking, educational activities can be split into two categories: life skills and professional skills. The life skills that we all need to learn, and the way we learn them, have remained relatively consistent across the ages: how to communicate, socialize and survive. But you can argue that today’s education system is skewed toward the second category, the teaching of professional skills, and it’s this category that will face the greatest opportunities and challenges over the next 50 years.

While educators say they prepare students for lives of learning, it’s more accurate to say their role is to prepare students for lifelong careers. That was a relatively simple task in the past; it’s now much more difficult. Educators could once teach someone law, for example, and fairly expect those students to remain gainfully employed in their chosen profession for at least 40 years. Today, though, technology is replacing paralegals and, while I don’t share this view, there are skeptics in the legal profession who publicly predict that their industry will be completely automated within 15 years. In a world where educators are used to educating Generation Y, Z and -- eventually -- Alpha students for lifelong careers, what happens when jobs in those careers -- at least for humans -- and even across entire industries cease to exist?

Furthermore, and as a demonstration of society’s titanic shortsightedness, we live in a role-based society where most people, particularly in business, believe that each person is suited to only one role. A software salesperson can’t sell hardware, a hardware salesperson can’t become a lawyer, a lawyer can’t become a data scientist, and a data scientist can’t become a software salesperson, and so on. If we are to mitigate even some of the hundreds of millions of redundancies we’ll face over the next 20 years, society has to change its thinking. After all, there’s no reason why many of us can’t learn new skills.

The changing job market over the next 20 years

Governments and academia estimate that between 30% and 50% of all of today’s jobs will be replaced by technology. However, unlike the disruptions of yesteryear, where technology replaced blue collar jobs, today’s technologies are replacing white collar knowledge workers, and it’s this shift that could have dire consequences for your and your children’s career prospects. Some of the world’s best self-learning artificial intelligence systems and cognitive computing systems are already replacing advisers, analysts, artists, commentators, consultants, doctors, journalists, musicians, paralegals, Ph.D.s, teachers, translators and even the data scientists who created their original algorithmic models. Machine vision systems are replacing quality inspectors, maintenance workers, security analysts and security guards. Robots have already replaced many blue collar factory and warehouse jobs, and now they’re replacing bar staff, maintenance workers, porters, soldiers, waiters and surgeons, while their modern-day software-only counterparts are replacing administrative personnel, customer service clerks and FX traders.

In other areas, autonomous vehicles -- from cars and trucks to aircraft and half-million-ton cargo ships -- are eliminating drivers, operating crews, parking attendants, pilots, sailors and traffic wardens. Avatars are replacing actors, bank tellers, call center agents, teachers and pre- and post-sales support staff. Cloud computing has reduced the need for change managers, enterprise architects and operations staff, while the internet of everything is reducing the need for engineers, inspectors, facilities managers and maintenance staff. Smart city technologies will reduce the need for police officers, street cleaners and myriad other public servants, while wearable systems and telehealth technologies are reducing the demand for secondary care workers, doctors and personal trainers.

The lists go on.

Look around you, and I’d be surprised if you yourself aren’t already seeing some of these changes occurring -- albeit gradually, for the moment.

Unlike the industry disruptions of the past, though, where jobs were destroyed but new ones sprang up, worryingly no one -- from the UN to the G8 -- has any idea what the jobs of the future might look like. And while some point to jobs that need creativity, empathy and social skills, machines are acquiring even those skills.

The future of education

Education is one of society’s cornerstones. After all, as government spokespeople say, it’s what prepares us to become “useful and productive members of society.” But as the job market continues to shift, education needs to shift ahead of it -- decades ahead, in some cases -- in order to prepare people for jobs in 20, 30, 40 and even 50 years’ time. And the less we talk about people living longer and having to retire even later in life the better -- let’s keep things simple here.

The education industry has a unique dilemma. On average it has 18 years to prepare people for careers that span 50 years or more, and as the pace of technological change continues to accelerate, providing people with skills that keep them sharp and employed throughout their lifetimes is no small feat. As we’ve seen, if the analysts are right, then by 2036 at least one-third of today’s jobs -- the ones that most academic institutions are ostensibly busy preparing Generation Y and Z for -- will have been taken by machines.

The upshot of all of this is that the education industry needs to be developing hard and soft skills curricula that prepare students for a changing world. But often the education industry is at least one or two generations behind the technology curve -- a curve that's increasing exponentially. Take, for example, today's chronic shortages of cybersecurity experts, data scientists and software developers -- roles that are cited time and time again by the Fortune 500 and government agencies as being in great demand.

For Generation X, whose members are often at greatest risk of being made redundant, these topics, and the subject matter underpinning them, would have had to be included in the 1960s and 1970s curricula. How many schools in the ’60s prioritized programming as a subject -- and how many do today? Even now, in the “digital age,” the answer is very few. While we could argue that schools are beginning to catch on to software programming as a crucial part of the curriculum, today’s scientists have moved on and are programming biology. Where is the training for those skills? The apparent difficulty we have in forecasting the jobs of the future could leave two or more generations struggling to find work.

In the meantime, the education industry faces a unique opportunity -- the opportunity to reach out to everyone on the planet and provide them all with access to insightful, valuable content. And there are three areas -- curation, distribution and consumption -- that are all going through paradigm shifts.

Curation
Every one of us has a unique individual learning style, so curating educational materials that get the most from each student is no small task. In the past, the majority of content was standardized, and the same materials were pushed out to every individual in the same way -- irrespective of ability or learning style. Inevitably, some people took to it and others didn’t, and those who didn’t got left behind. Over time, textbooks and course content became digitized, and now we have a plethora of interactive apps, e-books and other on-demand materials. The digitization of the education industry, albeit gradual, is a turning point. Now in digital form, materials are accessible to everyone and everything -- humans and machines -- and as a consequence they can become smart.

Today, all of our educational course materials are curated by humans, but there will come a tipping point. By 2030, equipped with artificial intelligence and unlimited access to powerful cloud computing resources, curriculum-savvy machines will be able to create materials that, based on a wealth of available data, are specifically tailored to the needs, aspirations and learning styles of each individual student. Device-embedded cameras and machine vision systems will be able to analyze students’ facial expressions and body language to gauge how engaged and invested they are, as well as how easy or difficult they’re finding a particular subject or topic. Behavioral analytics will analyze writing styles, patterns and speeds to measure competency and detect early signs of conditions such as ADHD, dyscalculia, dysgraphia, dyslexia and even memory problems.
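To make the idea of behavioral analytics a little more concrete, here is a deliberately toy sketch in Python. Every feature name, threshold and weight below is invented for illustration only; a real system would learn such parameters from data rather than hard-code them.

```python
# Hypothetical sketch: combining simple writing-behavior signals into
# an engagement estimate. Features, cut-offs and weights are invented
# for illustration, not empirically derived.

def engagement_score(words_per_minute, avg_pause_seconds, corrections_per_100_words):
    """Return a rough 0-1 engagement estimate from writing-behavior features."""
    # Normalize each raw signal to a 0-1 range (cut-offs are arbitrary).
    speed = min(words_per_minute / 40.0, 1.0)            # faster typing -> more engaged
    focus = max(1.0 - avg_pause_seconds / 10.0, 0.0)     # long pauses -> less engaged
    fluency = max(1.0 - corrections_per_100_words / 20.0, 0.0)
    # Weighted blend of the normalized signals.
    return round(0.4 * speed + 0.3 * focus + 0.3 * fluency, 2)

print(engagement_score(30, 2.0, 5))
```

The point is simply that once learning behavior is digitized, even crude signals can be combined into a feedback measure that adaptive content could respond to.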

With so many different modes of feedback, the A.I. engines -- or A.I. directors, as they will come to be known -- will create rich, adaptive, personalized educational materials in real time. Gamified and embedded with A.I., behavioral, contextual and semantic analytics, augmented reality (AR), natural language processing, universal translation and virtual reality (VR), content will be able to take on a life of its own, even going so far as to provide students with their very own personalized VR teaching avatars that could take the form of anything from a talking tree to a representation of Johnny Depp.

Distribution
Over the past 20 years, we have seen a significant change in the way content is distributed. In the past, students had to be in classrooms, but now educational institutions like MIT and Harvard, and even private companies like P&G and General Electric, are offering students from around the world the opportunity to attend their own versions of massive open online courses (MOOCs) -- classes run over the internet, often for free, that can have hundreds of thousands of participants.

The proliferation of new channels creates new opportunities and new problems for educators whose curricula are often standardized and highly regulated. The internet and the proliferation of new over-the-top (OTT) content -- via channels that can include YouTube, WhatsApp and even Disney, Harvard and the app stores -- means that children have access to a world of new material of variable quality and sometimes questionable perspectives.

As more and more content goes OTT, how we find it and where we find it will also change. Today, we’re already beginning to witness the creation of smart content. Ostensibly the third wave of disruption to hit the content industry -- the first being printing and the second being the internet -- smart content is content embedded with A.I. and machine learning that, rather than waiting for its audience to seek it out, seeks out its audience instead. Imagine, for example, content that can analyze the job market, spot new trends months or years before they materialize, and push the right types of content to you so that you are prepared, virtual CV in hand, when those trends arrive.
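As a toy illustration of content seeking its audience, consider modules that rank themselves against a learner’s profile so the most valuable missing skills surface first. All skill names, trend scores and module titles here are invented for the example.

```python
# Hypothetical sketch of "smart content" seeking its audience: rank
# course modules by how strongly they cover skills that are trending
# in the job market but missing from a learner's profile.
# All data below is invented for illustration.

trending_skills = {"data science": 0.9, "cybersecurity": 0.8, "biology programming": 0.6}

learner_profile = {"data science", "spreadsheets"}

modules = [
    {"title": "Intro to Cybersecurity", "teaches": {"cybersecurity"}},
    {"title": "Programming Biology 101", "teaches": {"biology programming"}},
    {"title": "Data Science Refresher", "teaches": {"data science"}},
]

def gap_score(module):
    """Sum of trend scores for skills the module teaches but the learner lacks."""
    gaps = module["teaches"] - learner_profile
    return sum(trending_skills.get(skill, 0.0) for skill in gaps)

# Push the modules covering the most valuable missing skills first.
ranked = sorted(modules, key=gap_score, reverse=True)
for m in ranked:
    print(m["title"], gap_score(m))
```

Here the refresher module scores zero because the learner already has that skill, so the content effectively steps aside in favor of material that closes a real gap.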

Today only 3 billion people are connected to the internet, but over the next decade new stratospheric network platforms like Google’s Project Loon, Facebook’s Project Aquila and OneWeb will connect the remaining 4 billion, giving them all the same access that you and I take for granted. While we might think that distribution is already ubiquitous, the fact remains that only 40% of the planet is connected -- and that in itself presents educators with an opportunity.

Consumption
Consumption will be one of the most rapidly changing parts of the learning equation. Students will increasingly become accustomed to A.I., AR, avatar and VR-powered content, but over time those technologies will give way to platforms (for which there are already working prototypes) that use brain-computer interfaces (BCI) to transmit content directly into our brains. The adoption of all of this content, whether it’s VR- or BCI-based, will always be influenced by accessibility, affordability and design. And the easier the content is to absorb, the more potential we’ll realize.

The good news is that many of today’s modern millennial organizations have already embraced a culture of design thinking. How many of you think that 3-year-olds would embrace technology as quickly as they do if it were difficult to use? OK, they may need an adult’s finger to unlock an iPad, but I’m guessing that -- if they’re anything like my children -- they know how to navigate and use their parents’ gadgets with ease and that, in some cases, they can use them better than their parents do.

Today and in the future, it’s the frictionless customer experience -- the product of design thinking -- that accelerates the pace of adoption of new technologies: systems that are well-thought-out and implemented correctly. Generations Y, Z and Alpha have shown us that they take to new technologies like ducks to water. And as new technologies get curated into new products and services (moral and ethical implications aside), they will have no qualms about embracing capabilities that not so long ago were thought of as magical.

As one student said when asked to describe today’s technology to a time-traveling visitor from the 1800s: “In my hand, I hold all of the world’s information.”

Just think what you could do with that...

This article is published as part of the IDG Contributor Network.