The unintended consequences of technology
Escalators were invented to get us up faster than the stairs. But since we stand still on them, we actually get up more slowly than before (the escalator paradox). This was unintended.
When we are confronted with a new technology, at first we ignore it, then we deny needing it, but eventually we passively accept and use it. So technology, instead of being the tool we would like it to be, is often the active, dominant force in the human-technology relationship. And we submit to it, even when it has unintended consequences.
We have already merged with our mobile phones. We use them for work, to make payments, to read our newspapers, to remember important dates, to play games, read books, listen to music, buy stuff, read reviews — anything really.
But nobody said they needed a mobile phone when asked 20 years ago. Now we are on our mobile phones 3 hours per day, up from 1 hour 5 years ago (check your own mobile behaviour with the Quality Time app). Mobile phones are shortening our attention spans, making our questions lazier, and preventing us from ever being 'off': always available for work, always comparing social status. Was that what was intended?
Personalisation was used by Google to improve its search engine by offering you results relevant to your search. But Facebook is filtering your timeline to show you only news that is deemed relevant to you (and your friends). We call this the filter bubble. The same media channel enables what they call micro-targeting: sending (fake) news directly to a predefined audience, who will spread it in their network. In this manner, an entire bubble might accept as reality a news fact that is made up and unchecked by independent media. That was not what was intended.
‘Computer says no’ was a recurring sketch in the TV comedy show Little Britain. A couple of months ago it was literally said to my carpenter when he was refused an upgrade of his dental insurance. We seem to accept the authority of computers (algorithms). When the system produces a “No”, what can you do, right? But a computer (algorithm) can only act as it is instructed. By a human. Computer says no means human says no. And you should be able to confront that person when you disagree. ‘Weapons of Math Destruction’ is an interesting book that deals with this topic. Because that was not what was intended.
We are so used to getting our digital tools for free. And even when we realise that we are actually paying with our data, which is being monetised and added to our profile, most people seem fine with it. But all this data is being gathered and owned by just a handful of companies. Google can recognise hundreds of millions of people with 98% accuracy. Through Waze they know where you have been and for how long. Through WhatsApp they know who you hang out with, when and where. They know what you like to eat, which restaurants you go to, what your next holiday destination will be, what shoes you like, what you read, watch and listen to. A data economy owned by a few giants: that is not what we wanted.
I love the exponential growth of technology. I work in technology. If technology can help solve the great challenges like hunger, disease and climate change, then that’s great.
But we have to realise that technology always bites back, and the unintended consequences are, well… unintended.
What we can do about this is part of my new talk ‘The unintended consequences of technology’. For speaking engagements please contact Speakers Academy.