Our predictive skills are about as reliable as a crystal ball.
The phrase “fourth industrial revolution” has become ubiquitous. It’s meant to denote a huge shift in the socioeconomic fabric of society, driven by the availability of increasingly intelligent machines. These machines will be able to do things we can’t, as well as take over things we can. Jobs will be lost. And new jobs will be created.
The fourth industrial revolution idea owes much of its credibility to a book by engineer, economist and World Economic Forum founder Klaus Schwab. He argues that an interconnected world, a cheapening of computer power and storage, developments in artificial intelligence, and advances in areas of biology will have revolutionary effects on our world.
He lays out a range of predictions, of greater or lesser confidence, about what these effects may be. And he argues compellingly that we need to apply ourselves to the human dimension of the revolution: to considering, and taking control of, the effects of it on social inequalities, poverty levels, political structures, labour, the way we assess productivity, and, deepest of all, what it really means to be human, given that so many formerly human tasks will be done by machines, some even via augmentation of human bodies.
It’s a good book, but has its weaknesses. It’s historically not very nuanced; it focuses on economics at the expense of politics. Most importantly, it appears to suffer from “confirmation bias” – the tendency to see any evidence as supporting your view, and to discount evidence that doesn’t.
These strengths and weaknesses reflect the strengths and weaknesses of the wider debate around the fourth industrial revolution. When the idea is used as a stimulus to reconsider what we are doing and think about the future, that’s great. When the narrative morphs into a series of predictions about life in two, 20 and 200 years, it’s easy to lose the plot.
Allocating resources and designing strategies based on the predictive content of the fourth industrial revolution narrative would be dangerous, given that even two decades ago it was impossible to predict the pace of technological development we’ve since seen.
So caution is necessary. We can’t simply work out what is going to happen during the fourth industrial revolution, and place our bets. That’s because people’s predictive powers, never strong, become much worse when we are in the grip of a “big idea”. They become not merely bad, but worse than random.
The tortoise and the hare
Psychologist Philip Tetlock has conducted large multi-decade studies of socio-political predictions since the 1980s. For example, he asked people to make predictions about the future of communism and capitalism. His results, presented in his book Expert Political Judgment, are striking.
It makes no difference whether you are intelligent, a subject expert, have access to classified information, have a PhD, are left or right wing – none of the traditional markers of expertise translate into improved prediction performance.
The only significant variation relates to cognitive traits that Tetlock characterises as “fox” and “hedgehog”.
A fox has many ideas. A hedgehog has one big idea. In the ancient Greek saying attributed to Archilochus – “the fox knows many things, but the hedgehog knows one big thing” – from which Tetlock draws these creatures, the point is that the hedgehog’s one big idea (rolling up into a ball and sticking its spikes out) is enough to defeat the quick-witted fox. But Tetlock draws the opposite moral for prediction. Having one big idea to which you are fundamentally committed makes you far less likely to be a good predictor.
This result has important consequences. It explains why pundits are so often wrong, missing the huge events of recent times and misjudging many others. Pundits succeed because they exude confidence, which is characteristic of the hedgehog, who sees the world in clear and simple terms, and usually absent in the fox, whose world is complex and uncertain.
Fox-thinkers aren’t exactly great predictors. But they are better than random, and certainly better than hedgehogs. Their scepticism, uncertainty and humility mean they will change their minds when new data come in. This is obviously rational, and the data show that looking for opportunities to change your mind – asking what could possibly go wrong – makes for a far better prediction strategy than hedgehog-like adherence to a single idea.
Beware of hedgehog thinking
There’s a great deal to applaud in efforts like Schwab’s to consciously review contemporary circumstances. But we need to be careful of the temptation to adopt a single lens, whether rose-tinted or grimy, for understanding a complex world.
A critical stance is essential if the fourth industrial revolution is to be a stimulus for debate rather than a dogma.
So, if you see the fourth industrial revolution everywhere, beware: you may be in the grip of hedgehog thinking – just as you are if you reject the entire notion.
As Tetlock’s work shows, if you see certain future events as inevitable, and wonder how others can’t see that too, then you’re probably wrong. It’s better to remain inquisitive, uncertain and critical, and to apportion your belief to the evidence. This is how humans will benefit from the fourth industrial revolution, and how we will take control of it.
About The Author
Alex Broadbent, Executive Dean, Faculty of Humanities and Director, African Centre for Epistemology and Philosophy of Science, University of Johannesburg
This article is republished from The Conversation under a Creative Commons license. Read the original article.