How AI could force you to commit suicide
Imagine a few years into the future. Not 50, but perhaps more than 5.
Imagine looking at a screen such as your smartphone (or the future equivalent). Suddenly the screen starts to flash a dizzying array of images, colors, sounds, and words, and you feel very sad. The sadness seems to come from nowhere, but it totally overwhelms you. You are unable to look away, compelled to keep watching. Utter sadness and hopelessness engulf you, sending you reeling down an emotional black hole. The screen is showing you an absurd concoction of images, colors, words, and shapes. You feel that there is no hope left. In a split second, you decide life is hopeless and throw yourself in front of a car, ending it all.
What just happened?
You were looking at a screen, such as your smartphone. The screen was equipped with a front-facing camera, and an app running a sophisticated emotion-tracking AI was watching you. It read your facial expressions and, drawing on a large dataset of other people’s emotions, built from images, video, and similar data points, coupled with plenty of data from your personal profile, determined your current emotional state with uncanny precision.
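To make the idea concrete, here is a minimal sketch of what reading an expression against a labelled dataset could look like. Everything here is hypothetical: the toy features (mouth curve, brow raise), the tiny dataset, and the nearest-neighbour matching are stand-ins for what would, in practice, be a deep model trained on millions of video frames.

```python
# Hypothetical toy dataset: (mouth_curve, brow_raise) features,
# each labelled with the emotion the expression conveys.
DATASET = [
    ((0.8, 0.6), "happy"),
    ((-0.7, -0.2), "sad"),
    ((0.1, 0.9), "surprised"),
    ((-0.4, -0.8), "angry"),
]


def classify(features):
    """Return the label of the dataset entry closest to the
    observed features (simple nearest-neighbour lookup)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(DATASET, key=lambda row: dist(row[0], features))[1]


# A slight smile with raised brows lands nearest the "happy" example.
print(classify((0.7, 0.5)))   # → happy
print(classify((-0.6, -0.3)))  # → sad
```

The point of the sketch is only that, given enough labelled examples of other people, a new face can be matched to the nearest known emotional state with no understanding required.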
What it does next is show you words, sounds, and/or colors in quick succession, or slowly for that matter, to manipulate your mood. A couple of thousand times per second, the algorithm evaluates your emotions and calculates the best possible way to nudge you towards the ‘target’ feeling. What the AI decides to show us doesn’t have to make sense, but it will nevertheless have the desired effect on us. We won’t know why this seemingly random combination of words, colors, and sounds makes us feel so sad; it just does. And all the while, the AI will continue to evaluate you and provide input specifically designed to make you feel increasingly worse. Once the AI registers your declining mood, it will push harder.
Your emotions have now been hijacked, and like a deer caught in headlights, you are unable to move. The dataset isn’t wrong: as a species, we’re not that unique. Quite the contrary, we are very much alike, and I think this deserves some extra attention. If you feed a computer enough information about the human psyche and how our moods work, it will eventually become very good at understanding how we feel. Armed with this insight, it will also become very good at understanding how to change our mood.
With the emotional responses of a billion people at its disposal, the AI knows exactly what to show us to elicit the desired response. And from moment to moment, running trillions of calculations per second, it will know exactly how to shift your feelings from one state to the next. Scientists in New Zealand have already published results showing that what you’re going to do can be predicted a split second before you do it.
We might enjoy the age-old idea that we have free will; that we’ve been endowed with some ethereal, mystical power that lets us pull our decisions from the “depths of our soul”. But much like the idea that the world is flat, this notion is now having to come to terms with harsh reality (which, frankly, kind of hurts, because I quite like the idea that I am somehow more than just a machine made from meat). We do not have ‘free will’.
In a blog post, DeepMind explains how they used machine learning to reduce the energy consumption of one of their data centers.
The DeepMind article explains the ‘why’, but doesn’t say much in the way of ‘how’. One suspects that the engineers aren’t exactly sure of the ‘how’ themselves, which should scare you.
The moral of the story is that armed with gargantuan data sets, extremely capable algorithms and countless teraflops to run them, the computer may well choose paths that seem incomprehensible to us, but does a far better job than we could ever imagine at manipulating us. In the wrong hands, this could have fatal consequences.
Lesson: Don’t look at the screen!