In this multi-part series, Robert Bolton talks about how far we’ve come since first thinking about automation and technology, and sets the stage for an in-depth exploration of the intersection of computer science and human resources.
As long ago as the late 1960s, artificial intelligence could be said to have come close to passing the “Turing Test” detailed in my article introducing this series. Now, in the second decade of this century, it is clear that the machines have graduated and are coming to work. In many cases, they work for us as individuals. The way in which people socialize, play, exercise, buy and communicate has fundamentally changed as a result of advances in digital technologies.
But actually working with us as business partners is another matter entirely, and “artificial intelligence” is just one aspect of cognitive function. Indeed, several distinct cognitive capabilities would be required by any business looking to employ robotic or automated workers.
Certain capabilities are already commonplace in the devices we work with, including process automation and digital interactions, but many others are not yet advanced enough to be of use. Still, the revolution is coming: the Bank of England estimates that 15 million jobs will disappear from the U.K. economy over the next 20 years due to robotic automation, and the phenomenon will be global. It is not a matter of if; it is a matter of when and who.
Unsurprisingly, the cognitive capabilities already prevalent in modern technologies deal mostly with hard data. After all, even the machine created by Alan Turing in the 1940s and featured in The Imitation Game was a successful demonstration of process automation for interpreting data.
The cognitive revolution will augment those abilities with decision science, so that machines can not only collect and interpret data but also analyze it to support decisions.
This was examined in a 2013 paper, “The Future of Employment: How Susceptible Are Jobs to Computerisation?”, by Carl Benedikt Frey and Michael A. Osborne of Oxford University. It is also no surprise that Frey and Osborne found that industries that deal heavily in data face a high likelihood of automation. In fact, they estimated that jobs such as accountancy and auditing have a 94 percent probability of computerisation, given forecast productivity gains of 30 percent and ROI calculations of between 600 percent and 800 percent.
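The ROI figures above are reported outcomes rather than a worked derivation, but the arithmetic behind a number in that range is straightforward. The short Python sketch below applies the standard ROI formula to purely hypothetical cost and savings figures; the inputs are illustrative assumptions, not values taken from the Frey and Osborne paper or any KPMG analysis.

```python
# Minimal sketch of the ROI arithmetic behind figures like the 600-800 percent
# cited above. All inputs are hypothetical and chosen only for illustration.

def roi_percent(annual_savings: float, years: float, upfront_cost: float) -> float:
    """Return ROI as a percentage: net gain over the period divided by cost."""
    net_gain = annual_savings * years - upfront_cost
    return 100 * net_gain / upfront_cost

# Hypothetical case: an automation project costing 250,000 that saves
# 350,000 per year in labor and error-correction costs over five years.
print(f"ROI: {roi_percent(350_000, 5, 250_000):.0f}%")  # prints "ROI: 600%"
```

On these assumed figures the investment returns 600 percent, showing how relatively modest annual savings accumulate into the headline ROI range quoted in such analyses.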
What may be more of a surprise, however, is that they also theorized that a whopping 47 percent of all U.S. jobs “were at risk”. Stranger still, according to the World Bank, middle-ranking jobs are just as susceptible to automation as more repetitive (and thus more obvious) candidates.
So what makes those 47 percent of jobs more at risk than the rest? In a 2003 paper, Maarten Goos and Alan Manning of the London School of Economics identified specific qualities shared by jobs that are less likely to be automated in the near future; they predict that these careers are good candidates for being supported by cognitive augmentation.
These qualities are:
With those in mind, Goos and Manning foresee a “hollowing-out” of the workforce, represented by the following diagram.
With this dynamic, in which the human workforce is required only for low-skilled but high-dexterity jobs or for high-skilled work, they coined the phrase “lousy and lovely” jobs to describe what to expect from the removal of middle-income routine jobs.
To learn more about the integration of digital and human labor and the workforce-shaping process, read Rise of the humans and listen to KPMG’s Anticipate podcast.