“Elementary, my dear Alexa!”

Is IoT a credible witness? This case raises some important questions…

Earlier this year, Alexa was witness to a horrific murder. You probably heard about it: the Hot Tub Murder case in Arkansas, in which a man was found face down in a hot tub after drinking at his friend’s house.

This would not necessarily point to foul play, but in this case there was a wider constellation of evidence for the detectives to consider, extending to the smart-home devices of the homeowner — and prime suspect.

The plot thickened in a way that really strikes a chord for those of us designing for the IoT landscape and the information it generates, and highlights key areas of concern for us in designing a connected experience: data credibility, data contamination and privacy controls.

After the police noticed that the crime scene had been hosed down suspiciously during the night of the killing, they queried the “smart” water meter data for any usage spikes in the night; going further, they then sought the recordings saved by the Amazon Echo, which may have been in earshot of an alleged struggle, perhaps unlocking key clues to the crime.

(Remember, the Echo records questions that are put to Alexa, including the moment just before and after the wake word, and saves the information to the cloud.)

“Witness to Crime” is a new and unexpected service for Amazon to add to the Echo’s list of capabilities, and the company did initially resist the prosecutors’ request for data on First Amendment grounds, concerned about the domino effect such a new investigative paradigm could bring. In the end, however, the defendant agreed that the recordings could be used as evidence — as an ex-cop, he was clearly confident that Alexa wouldn’t implicate him. The Alexa constitutional drama is over…for the moment, at least. We shall see how the case unfolds, as they are back in court soon.

We’re used to security systems and cameras, both in our daily lives and played back to us in our courtrooms, but now that the field of evidence gathering has been extended into devices intended for the smart home, IoT could serve as a new Sherlock Holmes. The scope of digital forensics has widened dramatically, especially with the Echo’s compatibility with other devices at the top of everyone’s mind.

Not to exaggerate, but there are some huge questions here for… well, mankind, really.

Let’s start with the credibility of the data. Is a smart-home product like Alexa — in any product shape, be it an Echo, Dot, Show or even Look — ever a reliable witness? Will she stay that way? Much of the time, when I ask her to play a song, she can’t understand me and parrots back nonsense that, through a different lens, could be misleading. For example, “Alexa, play the Killers!” easily becomes “Alexa, we’re the killers!”

(Indeed, it could be kind of fun to imagine how you might frame someone using these smart devices — if you were, you know, that kind of person.)

Suffice it to say that, like most eyewitnesses, Alexa sucks. And, as a listening device, the Amazon line doesn’t currently perform vocal authentication to distinguish who’s speaking — and even if it does get an upgrade to voice authentication, the Echo would remain easy fodder for imposters and hackers alike.

It does raise interesting questions, however, and by extension, we have to consider how the data generated by these devices might come to intersect with other data and allow for cross-contamination. When you have one possibly unreliable witness sharing data, or colluding and possibly mingling data in an artificially intelligent, networked service without human intervention, does that lead to potential data hysteria or contamination across devices? What might get thrown out of whack?

And where is the jurisdiction when the control of these devices is done remotely, generating data and sending it back and forth and across international borders and servers? Where is the true perimeter now?

And finally, that question leads us straight to privacy issues: Who owns the data — pure and aggregated? Who is really consenting to what in a listening ecosystem, one in which children are usually present? What are the implications of erasing the data?

The coming GDPR regulations in Europe for 2018 put strict controls in place for the use of all data, especially biometric data, but we shouldn’t be at all confident that the smart landscape of IoT products and services currently conforms to them.

We don’t know the answers to most of these questions yet, and in many cases it may be the courts that end up deciding — as this murder case has suggested.

As designers working in the reality of now, however, actively working to invent these solutions, the challenge is immense.

Designing for and with data can be tricky, but considering questions raised by the Hot Tub case is actually an instructive step. We’re working to balance the ideal experience for an actual living person moving through time responsibly within the confines of the law, and we work closely with Data Privacy officers as required to achieve this balance in connected spaces. We should also always try to catastrophize, to predict and prevent any unintended consequences that could befall a user down the line.

“You know my methods, Watson. There was not one of them which I did not apply to the inquiry. And it ended by my discovering traces, but very different ones from those which I had expected.”

— The Memoirs of Sherlock Holmes (1893), Sir Arthur Conan Doyle
