The advice is aimed at those suffering from a modern phenomenon known as “death by GPS” – an extreme version of something most of us have already encountered. It happens when someone follows SatNav instructions even when it is plainly obvious that the route is wrong. In 2012, Apple hurriedly amended its Maps app after a glitch began guiding Australians hoping to visit the city of Mildura into remote wilderness.
Three years later, a driver ignored road warning signs and followed his GPS navigation off a demolished bridge in Indiana.
And, on a smaller scale, the South Yorkshire village of Wales often plays host to tourists and lorry drivers who think they have arrived in another country.
Such incidents may seem insignificant when set against the huge benefits of technology.
But what is going on in someone’s head when they blindly follow technology, even if it leads them to act in a dangerous way? The answer, I believe, is AI – not Artificial Intelligence but Artificial Idiocy.
This is the human tendency to believe and follow whatever our smartphones tell us.
Consider two tweets sent by the same person in the aftermath of a terror attack on April 23, 2018, in Toronto, when a deeply troubled young man drove a rented van through a crowd, killing 10 people.
The first tweet reported an eyewitness identifying the attacker as an angry man of Middle-Eastern origin. The second tweet reported an eyewitness identifying him as white.
The second was correct, while the first was wrong: the attacker was a white 25-year-old from Ontario.
Yet an investigation by Chris Meserole of the Brookings Institution think tank revealed that, in the 24 hours following the attack, the erroneous tweet received 1,000 per cent more retweets and likes than the correct one.
Once enough people had picked up on the false narrative of Middle-Eastern terrorism, Twitter’s own algorithms boosted its visibility.
“At its worst,” Meserole noted, “this cycle can turn social media into a kind of ‘confirmation bias machine’, one perfectly tailored for the spread of misinformation.”
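The cycle Meserole describes can be sketched in a few lines of code. The toy model below is purely illustrative (it is not Twitter's actual ranking system, and the posts and numbers are invented): when a feed ranks content by engagement alone, whichever post is already ahead captures most of the exposure, and each view feeds back into its score.

```python
# Toy model of an engagement-driven feedback loop (a hypothetical
# illustration, not any real platform's ranking system).

posts = {"accurate": 10, "sensational": 12}  # initial engagement counts

def rank(posts):
    # Rank purely by engagement - accuracy plays no part.
    return sorted(posts, key=posts.get, reverse=True)

for i in range(1000):
    feed = rank(posts)
    # The top-ranked post takes 9 in 10 impressions, and every
    # impression feeds back into its engagement count.
    winner = feed[0] if i % 10 else feed[1]
    posts[winner] += 1

# After 1,000 impressions the early leader has pulled far ahead,
# regardless of which post was true.
```

A two-point starting gap is enough: the ranking hands the leader most of the audience, and the audience's clicks cement the ranking.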
By the standards of fake news in the 21st century, sharing one inaccurate tweet may seem pretty mild.
But the differing fates of these two pieces of information, released at the same time by the same person, illustrate the problem of digital platforms that value emotional impact over accuracy.
And a system that does this not only embodies a certain worldview but also elicits certain emotional responses in its users.
Anger, anxiety, disgust and surprise are much more likely to go “viral” than doubt or qualified endorsement.
You do not see many trending tweets beginning: “I’m not sure what to think about…” It is all about certainty, rather than constructive debate or disagreement. The same applies when speed of response is prioritised over measured insight.
Equally, the cute animal pictures and videos that pop up on your Facebook feed are not as random as they seem.
They are fed to you via sophisticated algorithms of the sort used by casino gaming machines.
And they are designed solely to encourage you to keep clicking on your smartphone or tablet.
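The casino comparison refers to what behavioural scientists call a variable-ratio reward schedule: rewards arrive after an unpredictable number of actions, the pattern that most reliably keeps people pulling the lever. A minimal sketch of the idea (the function name and numbers are illustrative, not taken from any real feed):

```python
import random

def variable_ratio_feed(mean_interval=5, pulls=20, seed=42):
    """Simulate a feed where a 'rewarding' post (a cute animal
    video, say) appears on average once every `mean_interval`
    refreshes, at unpredictable moments - a variable-ratio
    schedule, the same pattern slot machines use."""
    rng = random.Random(seed)
    rewards = []
    for pull in range(pulls):
        # Each refresh pays off with probability 1/mean_interval,
        # so the user can never tell which pull will be rewarded.
        if rng.random() < 1 / mean_interval:
            rewards.append(pull)
    return rewards
```

Because the payoff could come on any refresh, stopping always feels premature; that unpredictability, not the cat videos themselves, is what keeps thumbs scrolling.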
When the world’s most influential social media platforms deploy these systems, dangerously deluded thinking is being actively engineered.
This is not to say that people themselves are stupid.
Online, humanity may often look like a mess of tribal emotions and baseless rumours. But this tells you as much about online environments as it does about humanity.
People can be plenty of other things in different circumstances.
Education, democratic debate, community, family, faith all call forth different “selves”.
However, while a service engineered to produce Artificial Idiots may prove profitable in terms of data and profiling, it is also an appetising hunting ground for those who would seek to deceive and control us. I have spent the past two years writing a thriller set in the world of the dark net and its shadowy manipulators, in part because many realities I’ve come across in my research feel too strange to handle any other way.
Would anyone have believed a decade ago that hackers could subvert everything from children’s toys and baby monitors to TVs and fridges, turning them into “zombie” devices used in mass cyberattacks or for covert surveillance? Yet this is one of the most alarming consequences of the so-called Internet of Things, whereby otherwise ordinary electronic devices are now connected to the internet and able to “speak” to each other.
Would even the boldest futurist have predicted an official US government inquiry concluding “the Russian government interfered in the 2016 presidential election in sweeping and systematic fashion”, in large part through the manipulation of social media? Yet this is taken from the second paragraph of the Mueller report.
We live in a world where millions of people have willingly invited corporate surveillance into their homes. Google Assistant, Alexa, Siri… like SatNavs, they are wonderfully smart and useful.
But they also represent an invisible transfer of power towards a small number of data-hoarding giants.
They know every purchase we make, every song we listen to, every movie we watch, every book we buy, every route we drive and every holiday we book. They also know about the intimate conversations we have with our children and partners.
Forget the speculative future of a world dominated by super-intelligent machines. What technology author Shoshana Zuboff calls “the age of surveillance capitalism” is already here.
This is an era where behavioural monitoring, prediction and modification represent new frontiers for profit.
It is, for many of us, a supremely comfortable and convenient place to live, so long as we do not think too hard or bump up against too many of the boundaries being placed around our behaviour.

So what is going on when someone follows the directions given by their SatNav down a dead-end road into the middle of a desert? They are uncritically accepting whatever is on their screen, while failing to check it against what is actually happening around them.
When you walk around a city looking at Google Maps, you too gratefully accept its reduction of complexity into something instantly accessible: the sights, the bars and restaurants, together with their ratings.
If the system does not know about them, they do not exist.
And if all you ever do is accept this, if you never make the time to pause, to consider other ideas and values and frames of reference – then the system’s ignorance becomes your own.
What’s the alternative? Switch off your phone and remind yourself what it is like to be fully present in the moment. Take a walk with only your eyes to guide you.
Start a conversation with a stranger, give a loved one your undivided attention, pause before sharing a story on social media, think twice before agreeing to another unread set of terms and conditions.
And, whatever you do, do not believe everything you read online.
This Is Gomorrah: The DarkWeb Threatens One Innocent Man by Tom Chatfield (Hodder & Stoughton Ltd, £14.99). Call the Express Bookshop on 01872 562310, or send a cheque/postal order payable to Express Bookshop to: Gomorrah Offer, PO Box 200, Falmouth, Cornwall, TR11 4WJ, or visit www.expressbookshop.co.uk. UK delivery free.