Scarlett Johansson’s Amazon Alexa Super Bowl Ad May Be Fun, But It Also Raises Serious Questions

Scarlett Johansson and Colin Jost poke fun at an Alexa mind-reading future in this year's Amazon Super Bowl ad, but is the company hiding worrying longer-term aspirations in plain sight?

Andrew Maynard
EDGE OF INNOVATION
8 min read · Feb 12, 2022

Amazon’s 2022 Super Bowl ad wonders what it would be like if Alexa could read minds …

This year’s Super Bowl ad from Amazon features Scarlett Johansson and her husband Colin Jost imagining a future where Alexa reads their minds. The commercial’s a playful take on places we probably don’t want smart speakers to go as they try to serve our every need. And yet, fun as the ad is, even as Jost concludes “it’s probably better Alexa can’t read your mind,” it seems that this is exactly where Amazon and others are taking us!

Smart speakers like Amazon’s Alexa, Google Assistant, and Apple’s Siri are big business. According to one analysis, the global market in smart speakers is heading toward $24 billion by 2028 — all for a technology we didn’t even know we wanted just a few years ago. And as manufacturers fight for their share of this market, the race is on to make these devices more desirable and indispensable than ever.

Under the Smart Speaker Hood

The basic technology behind Alexa and other smart speakers is deceptively simple. When triggered by a certain phrase like “hey Siri”, “hey Google”, or simply “Alexa …”, the device records what’s being said and sends the recording over the internet to an automated data processing center.

Once there, powerful artificial intelligence-based algorithms make sense of what the person speaking with the smart speaker wants and send the relevant information and instructions back to it.

These may be as simple as providing the results of an internet search (like “what’s the weather” or “how old’s Scarlett Johansson”), or playing music. But as more and more devices are connected to the internet, devices like Alexa are becoming increasingly integrated into users’ lives as they allow them to control everything from lights and refrigerators to garage doors, door locks, TVs, toasters, blenders, and even toilets!
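
To make that round trip concrete, here’s a minimal sketch of the flow just described: on-device wake-word detection, then a trip to the cloud for intent parsing, then a response back to the device. Every name in it is hypothetical; this is the general shape of the pipeline, not Amazon’s or Google’s actual code.

```python
# Hypothetical names throughout: the general shape of a smart speaker
# pipeline, not any vendor's actual implementation.

WAKE_WORDS = ("alexa", "hey siri", "hey google")

def heard_wake_word(utterance: str) -> bool:
    """On a real device, a small on-device model listens for the wake
    word; here we just check the transcribed text."""
    return utterance.lower().startswith(WAKE_WORDS)

def cloud_parse_intent(utterance: str) -> dict:
    """Stands in for the data processing center: speech-to-text plus
    AI intent classification, reduced here to keyword matching."""
    text = utterance.lower()
    if "weather" in text:
        return {"intent": "get_weather", "reply": "Sunny and 72 degrees."}
    if "lights" in text:
        return {"intent": "lights_off", "reply": "Turning the lights off."}
    return {"intent": "unknown", "reply": "Sorry, I didn't catch that."}

def handle(utterance: str) -> str:
    # Nothing leaves the device until the wake word is heard.
    if not heard_wake_word(utterance):
        return ""
    # Only now is the recording sent over the internet for processing.
    return cloud_parse_intent(utterance)["reply"]

print(handle("Alexa, what's the weather?"))  # Sunny and 72 degrees.
print(handle("Just chatting at home"))       # (no response)
```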

Smart Speaker Super-Consumers

Bizarre as some of the things you can now do with a smart speaker might seem, all of this is a far cry from mind-reading. Yet underpinning this growing use of smart speakers is an interconnected web of sensors and data that is less about letting you operate household appliances hands-free, and more about analyzing, predicting, and satisfying your every need in what looks like a quest to turn us all into ever more dependent super-consumers.

Of course, we’re no strangers to this. Companies use the same data collection and analysis approach to serve up online ads that can seem eerily prescient at times. These ads aren’t reading our minds. But they are using increasingly sophisticated AI-based systems to predict what we’re interested in from myriad snippets of information, from the websites we visit, to the online content we linger over, to the mouse movements and keystrokes we use.

Limited as this information is — there’s only so much you can extract from someone’s browsing habits — it’s surprising how much AI algorithms can glean from the internet breadcrumbs we leave behind us. And the predictive power of these algorithms becomes exponentially greater when additional data are thrown into the mix — for instance, from devices that also monitor your location, or your activities, or your health.
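
As a toy illustration of that compounding effect, the following sketch trains the same simple classifier first on a single synthetic “browsing” signal and then on three combined feeds. Everything here is made up for illustration (the data is random, the feature names are invented); the point is just that weak signals stack.

```python
# Toy illustration with synthetic data: individually weak data feeds,
# combined, make a noticeably better predictor of a hidden "interest".
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
interest = rng.integers(0, 2, n)  # the hidden thing we want to predict

# Each feed is the hidden interest plus a lot of noise, a weak signal.
browsing = interest + rng.normal(0, 2.0, n)  # e.g. pages visited
location = interest + rng.normal(0, 2.0, n)  # e.g. places frequented
activity = interest + rng.normal(0, 2.0, n)  # e.g. fitness-tracker patterns

def accuracy(*feeds):
    X = np.column_stack(feeds)
    X_tr, X_te, y_tr, y_te = train_test_split(X, interest, random_state=0)
    return LogisticRegression().fit(X_tr, y_tr).score(X_te, y_te)

print(f"browsing alone:     {accuracy(browsing):.2f}")
print(f"all three combined: {accuracy(browsing, location, activity):.2f}")
```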

Data-Driven Prediction

As an example of how much information can be extracted from seemingly innocuous data, an interactive map of the movements of fitness device users, published in 2018, allowed anyone to infer the locations of a number of sensitive US military bases.

In this case, it wasn’t the physical location of the devices that was the issue, so much as who was using them and where. As the Washington Post reported, “Adam Rawnsley, a Daily Beast journalist, noticed a lot of jogging activity on the beach near a suspected CIA base in Mogadishu, Somalia.” By piecing together fitness device location data with behaviors that matched what you’d expect from military personnel, locations that should have been closely guarded secrets became easy to pinpoint.

Similarly, in 2020 Forbes and other media outlets reported that a beer review app could be used to track military personnel and secret military locations. All it took was a bit of cross-referencing between the app data and other social media feeds, and the realization that combining seemingly unconnected data feeds can make what is assumed to be private, public.

These and many other examples demonstrate the power of combining multiple data sources with AI to extract information that might, at first glance, seem inaccessible. Just a decade or so ago, such predictive power was extremely hard to achieve. Yet with massive increases in the number of sensors surrounding us in everything from cars and TVs to doorbells, thermostats, and health and fitness monitors, companies are finding themselves swimming in a rapidly expanding sea of data.

And with near-exponential advances in AI, this sea is delivering the promise of predictive riches that would have been unimaginable just a few years back.

The Perils of the Promise of Prediction

Despite the rapidly accelerating capabilities of emerging technologies, though, there remains a gap between what machines think we’re thinking and what we are actually thinking.

Yet even with this gap, it’s easy to be seduced into thinking that we — or rather, the machines we build — are capable of reading minds and predicting the future.

This tension between what we think is possible when it comes to predicting behavior, and what actually is possible, is already playing out in law enforcement. In 2020 BuzzFeed revealed that software developed by the company Palantir was being used by the Los Angeles Police Department to identify potential criminals, despite concerns over the ethics and validity of its use. And there’s growing concern over racial bias in AI algorithms that are used to determine the likelihood of someone having committed a crime.

Of course, the current uses of data and AI algorithms to anticipate what someone’s thinking, what they might have done (or might do), and what they might be persuaded to buy, seem a far cry from a mind-reading Alexa. But the same technologies that are being used to map out people’s movements, craft personalized ads, predict criminal behavior, and more, are being applied to the data feeds from smart speakers. And the more that they are able to connect what you are saying with your health data, shopping habits, movements, and behaviors, the more companies will be able to use your smart speaker to seemingly predict what you’re thinking, and act on it.

How far companies will go down this road isn’t certain at this point. But as the smart speaker market becomes increasingly crowded, companies like Amazon are going to have to work harder at standing out and offering something no one else does.

And this means milking their data streams to the limit — even going so far as trying to anticipate what someone is thinking.

Alexa, Do You Have a Hunch about Mind Reading?

We’re already seeing the beginnings of this move toward anticipation with Alexa. Through a feature called Alexa Hunches, the devices are programmed to identify unusual behavior and offer to act on it.

At this point, the feature is limited to noticing when connected smart home devices aren’t in their expected state — for instance if the lights are still on as you head for bed. But it is a step toward the technology combining multiple data feeds with predictive AI to anticipate what you want.
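
To give a flavor of what an expected-state check like that might involve, here is a rough sketch. It is purely illustrative, not Amazon’s implementation: a real system would learn each device’s expected state from usage history rather than hard-coding it as below.

```python
# Illustrative sketch of a Hunches-style check: compare each device's
# current state with what's typical for the time of day, and offer to
# act on any mismatch. Not Amazon's actual implementation.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    state: str  # "on" or "off"

def expected_state(device: Device, hour: int) -> str:
    """Learned from usage history in a real system; hard-coded here."""
    if device.name == "porch light":
        return "on" if hour >= 18 or hour < 6 else "off"
    return "off" if hour >= 23 or hour < 6 else "on"

def hunches(devices, hour):
    for d in devices:
        want = expected_state(d, hour)
        if d.state != want:
            yield f"It looks like the {d.name} is {d.state}. Turn it {want}?"

home = [Device("living room lights", "on"), Device("porch light", "off")]
for prompt in hunches(home, hour=23):  # heading to bed at 11 pm
    print(prompt)
```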

This ability to anticipate the needs and desires of users is a capability that companies are only just scratching the surface of. Researchers are already beginning to explore whether devices like Alexa can be used to spot signs of dementia from analyzing speech patterns. And it’s a relatively small step from here to combining smart speakers with data from personal health monitors, internet-connected devices and other sources to predict early signs of memory loss.

But why stop at memory loss? Emerging research indicates that, when coupled with AI analysis, the subtleties of how you speak and the sound of your voice can be used as indicators of a range of diseases and conditions. By monitoring the pitch, tone and rhythm of your voice, as well as your breathing patterns, machines are becoming increasingly capable of diagnosing everything from Parkinson’s disease and cardiovascular disease to post-traumatic stress disorder.
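
For a sense of the raw material such systems work from, here’s a minimal sketch, assuming the open-source librosa audio library and a local recording named voice_sample.wav, that extracts the kind of prosodic features (pitch statistics, energy rhythm, voiced fraction) these diagnostic models typically build on. The clinical model itself is deliberately left out, since any thresholds or diagnoses here would be fabricated.

```python
# Minimal sketch of prosodic feature extraction from a voice recording.
# Assumes the librosa library and a local file "voice_sample.wav".
# Real diagnostic systems feed features like these to trained clinical
# models; none of that is included here.
import librosa
import numpy as np

y, sr = librosa.load("voice_sample.wav", sr=16000)

# Fundamental frequency (pitch) track over the typical speech range.
f0, voiced, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
f0 = f0[~np.isnan(f0)]  # keep only voiced frames

# Short-time energy as a crude proxy for speech rhythm and breathing.
rms = librosa.feature.rms(y=y)[0]

features = {
    "mean_pitch_hz": float(np.mean(f0)),
    "pitch_variability_hz": float(np.std(f0)),  # monotone speech scores low
    "energy_variability": float(np.std(rms)),   # pausing / breathing proxy
    "voiced_fraction": float(np.mean(voiced)),  # how much of the clip is speech
}
print(features)
```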

It’s entirely plausible that the Alexas of the future will be able not only to tell you what the weather’s like and turn on your TV, but also to tell you the state of your health — or of your mind.

The worrying thing is, they may also end up telling others this as well, whether you want them to or not.

Of course, this still isn’t mind-reading, but it’s hinting at smart speaker capabilities that begin to nudge toward being able to peer inside your body — and even your brain — by extracting information from data in ways that lie far beyond what we can do as simple humans.

And in a world where companies are fighting to stay ahead of the smart speaker competition, there’s little to stop the increasingly invasive use of these technologies — especially when consumers are being conditioned to believe that they need these technologies to live full and healthy lives.

Hidden In Plain Sight

It’s here that I have to wonder about Amazon’s aspirations for increasing the anticipatory and predictive power of Alexa to the point where it feels like the device really is reading your mind, even if it’s just a high-tech sleight of an AI’s hand.

This would make a lot of sense in a future where such capabilities are increasingly likely to be possible, and where maintaining a competitive edge is everything. In which case, is Jeff Bezos hiding his long-term plans in plain sight as Scarlett Johansson and Colin Jost joke about a mind-reading Alexa?

Of course, much of this is speculation. But in this rush to be ahead of the smart speaker tech pack, what would happen if the capability for Alexa to truly read your mind did arise?

Alexa, Meet Neuralink

The idea of smart speakers getting deep insights into what’s going on inside your head isn’t as speculative as it might seem. Elon Musk’s company Neuralink is already lining up human trials of its wireless brain-machine interface. And while early applications of the brain implant will most likely address medical conditions, the company’s aspirations are to create low-cost implants that connect your brain directly to the internet via your smartphone.

If and when we get there, it’s almost inevitable that someone will come up with an app that connects smart speakers to these smart implants. When they do, Johansson and Jost’s dreams of a mind-reading Alexa may not look so funny.

Of course, we’re not there yet, and the good news is that, after her dream-foray into an Alexa mind-reading future, Scarlett Johansson concludes that this is, indeed, a “bad idea.”

The question is, is this enough to keep Amazon from going there?


Andrew Maynard

Scientist, author, & Professor of Advanced Technology Transitions at Arizona State University