Alexa, Mass Shootings, and My Dreams
This is a story about Amazon Alexa, the role of mass shootings in my life, and my fear of dreaming about a better world.
We don’t dream anymore.
Or, at least, I personally find dreams to be little more than fantasy. A waste of time, or an idle amusement. Sometimes I say, “Oh yeah, someone should do that!” but that someone is never me. I may be wrong, but I don’t believe I’m the only person who has that problem.
Certainly, my dreams are nothing to be made real. Dreams never strike me as anything that could or should make my life materially better or more meaningful. Every now and again, though, I’ll see a beautiful idea flicker in the dark, ready to catch the tinder of society’s imagination… and I’ll watch as it’s smothered by our fear of what we might be if we only tried. I feel something like despair in those moments, because I know the fear is not only mine.
I understand. I have the same fears. Who are any of us to disturb the course of the universe? Yet when I look out into the world, another part of me rebels.
SpaceX built a reusable orbital class rocket, dropping the cost of access to space by an order of magnitude. A single company, with a shoestring budget, did what entire nations could not do mere decades before with all their military might. The company plans to grow this technology until access to other worlds is no more costly than a nice house in the suburbs.
This is real. This is happening. We will see this with our own eyes.
But where is the rush to build the ships to sail through the void? Where are the companies making plans to capture asteroids and shower the Earth with more mineral wealth from a single capture than has ever been previously harvested in human history? Where are the people inspired to bring life to other worlds so that we can learn how to save life on this one?
They exist. I’ve seen them. But they are rare.
What we drown in instead, what we are inundated with, are smug articles written by people whose science education ended in junior high school, and who seem to think it might not be that bad if all humans died, about how Mars — an utter hellscape — will become a haven for the rich. Or about how it would be nice if humans could get to Mars — ignoring the overwhelming evidence that this will happen shortly — but even if they ever did, what about the soil? What about the radiation? What about the need to solve other problems first?
The assumption exists, implicitly, that problems cannot be solved.
I see the same failure of imagination, of dreams, almost everywhere.
Satoshi Nakamoto invented blockchain technology to create a decentralized currency, so that no government or banking system could control our money. Did we rebuild our financial infrastructure or otherwise hold power to account? Did we take power back into our own hands? No. We made CryptoKitties.
I get it.
Like I said, I’m afraid too.
I dream of a world without mass shootings. More, a world without murder. Where no one owns or wants a gun because it’s an obsolete technology. Where there is hardly even a need for prisons, and it’s so safe outside you could let your child wander through any neighborhood anywhere without fear. For some reason, and I can’t for the life of me figure out why, I’ve never even written down the idea until now.
I think I’m afraid of being responsible for making it.
AMAZON KINDLE, TUCSON SHOOTING
On January 8th, 2011 I was living in Tucson, Arizona and looking for some packing tape. Although I couldn’t find the tape, I was assured by several members of my family that it did, in fact, exist... somewhere.
I’d stayed up late reading a fantasy novel the night before and dropped my Kindle while falling asleep, shattering the screen. Thankfully, I had purchased a protection plan. I simply needed to mail the Kindle back to Amazon, and I would be issued a replacement.
It was a great deal.
About thirty minutes after the start of my search, I found some packing tape — probably a different roll than the one my family had alluded to — in the garage, inside an empty box. With a single tear and rip, I finished packing my Kindle.
I crawled into my car and prepared to drive to the local grocery store, where there was both a UPS drop box and a US Bank where I could deposit some Christmas money I’d promised my grandmother I’d accept. No sooner did I start the car than a hesitant, unsure voice began speaking on the radio.
It was nothing at all like the polished voice of national news anchors, who are used to things like mass shootings.
It turns out that a congresswoman had been holding a rally in the very parking lot I was headed to. I’d had no idea. A man named Jared Loughner came up behind her and shot her in the back of the head. And then, for no real reason at all, he shot a bunch of other people, including a nine-year-old child.
I would have been right there if not for the packing tape.
HELPLESS FOR SANDY HOOK
Sometime in late 2013, I spoke with the mother of a child killed in the Sandy Hook massacre. She didn’t identify herself as such, although given her location and that her financial problems had begun due to funeral expenses the previous year, it wasn’t hard to figure out.
She didn’t ask for sympathy or tears. Without calling attention to why, I offered to do anything I could for her. I’d speak to anyone, go as high up as she asked, anything to make her problems go away. In a very tired and hollow voice, she said she was tired of being treated differently and asked for some guidance on how to get control over her life again. As it happened, I was in a position to give her that guidance. I gave all of it that I could.
She talked to herself a bit in the long stretches of silence while I was doing paperwork. Most of it was run-of-the-mill self-chatter. Errands she needed to run. People she needed to call. Then she mentioned her divorce.
“It was no one’s fault, really. We just couldn’t stand to look at each other anymore. I mean, we both kept seeing her face.”
I’d never felt more powerless to help another human being in my entire life.
ALEXA IS LISTENING
In late 2014, probably because I’ve been an avid Amazon user since college, I was invited to be among the first people to try an Amazon Echo. I’d always wanted a robot, and this was the closest thing to a robot I’d ever seen. I also remember being amazed that someone had finally, meaningfully solved the voice recognition problem. I remember taking it out of the box, amazed that it was pretty much just a cylinder.
Of course, I realized, she’d have to do all her thinking in the cloud.
All you had to do was say her name, “Alexa,” and she would awaken and listen to your command like magic.
“Alexa, what time is it?”
“Alexa, tell me a joke?”
“Alexa, what are the three laws of robotics?”
I was amazed at her versatility, at the love which had gone into her design. I wondered what else she might do.
ALEXA AND THE LAS VEGAS SHOOTING
In late October 2017, I saw actual footage from a mass shooting for the first time. Stephen Paddock had opened fire from a hotel window into a country music festival and killed 59 people. I saw a video from the point of view of the crowd. Someone holding a cell phone. What struck me wasn’t the blood, but the way the music stopped and the panic took over as Paddock fired into the crowd.
I could hear each shot.
Snip. Snip. Snip.
It was also the first time I looked at the problem as an engineer.
It upset me how the problem had become politicized, and that no one was exploring any kind of realistic solutions.
How could anyone at that concert, even if armed, have gotten off a shot that would have incapacitated a man standing 32 floors above? While under fire? Even in a Hollywood action movie, that would strain the limits of credulity.
So I got some paper and wrote down the word “Countermeasure” and beneath that wrote the following list:
1. Must be able to incapacitate any shooter in line of sight
2. Always on, always ready
3. Activates at the sound/sight of gun
4. Triangulates on sound?
5. Cheaper than security guard
Then, because I am not evil, I wrote:
6. How do you do this without massive violation of Civil Liberties?
I sat down for a long while and decided I’d think out some solutions, for no other reason than my own curiosity. I had no ideas.
So I asked Alexa to set my alarm for the next morning, prepared to go to bed, and thought, “Oh…”
PHASE I: Big Brother is Passively Listening
Let’s start with the idea of Alexa. I really mean sound recognition, but Alexa is the most common example we have of that technology. We have a small microphone device, and that device is always on and always ready to hear what you have to say. That device has been created to wake up only when you say “Alexa.” It then streams your speech to Amazon’s cloud servers, where a series of deep learning algorithms and hidden Markov models turns it into text, finds the appropriate response, and sends it back. When Alexa is done executing those commands, she goes back to sleep. We take as given that any non-meaningful speech is discarded.
Now, let’s change one thing about how Alexa functions. What if the sound that turned on Alexa wasn’t the word “Alexa” but the sound of gunfire? Or not gunfire, but any supersonic ballistic sound profile? There’s always some fine-tuning, but let’s take as a given that we figure out a common sound signature for most guns that Alexa will register as her wake-word. If there’s a gunshot near Alexa, she now knows that it is time to turn on. You could even fine-tune this with geo-tagging so that Alexa knows it’s not a problem if she hears a gunshot at your shooting range, but that it’s a big problem if she hears one at your bank.
Now, what does Alexa do with the fact that she has been turned on by gunfire? Well, it probably just makes sense that Alexa will call the police and tell them why she turned on and what address she’s registered to. Depending on the sensitivity of her sound profiles, she might even be able to name the type of gun she heard.
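The alerting logic described above can be sketched in a few lines. This is a hypothetical illustration, not anything Amazon ships: the `Detection` fields, labels, locations, and the confidence threshold are all assumptions made up for the example. It assumes a classifier has already labeled the sound; the policy only decides whether to call the police.

```python
# Hypothetical Phase I alert policy. All names, labels, and thresholds
# here are illustrative assumptions, not a real product specification.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "gunshot", "scream", "speech"
    confidence: float  # classifier confidence in [0, 1]
    location: str      # geo-tag category of the device's registered address

# Geo-tagging: places where gunfire is expected and should NOT trigger an alert
WHITELISTED_LOCATIONS = {"shooting_range", "hunting_area"}

ALERT_LABELS = {"gunshot", "scream", "help_word"}

def should_alert(d: Detection, threshold: float = 0.9) -> bool:
    """Decide whether the device should dispatch an alert to police."""
    if d.label not in ALERT_LABELS:
        return False           # ordinary sound; discard and go back to sleep
    if d.confidence < threshold:
        return False           # not confident enough to wake up
    if d.label == "gunshot" and d.location in WHITELISTED_LOCATIONS:
        return False           # a gunshot at your shooting range is fine
    return True                # a gunshot at your bank is a big problem

print(should_alert(Detection("gunshot", 0.97, "bank")))            # True
print(should_alert(Detection("gunshot", 0.97, "shooting_range")))  # False
```

The hard engineering problem, of course, is the classifier feeding this policy, not the policy itself; the point is that the response side is almost trivially simple once the sound signature is recognized.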
The important part is that if you ever fired a gun and there was an Alexa nearby, it would be exactly the same as calling the police and telling them where you were. Imagine the impact that alone would have on the use of firearms in crime. All for something you can buy for about $50 online with free two-day shipping.
There are more possibilities here.
What if Alexa activated upon hearing screams? Or when someone shouted “Help me!” or “He’s got a gun!” or “Call 911!”? We could make one unspeakable word and teach it to all children. Never to be spoken except in dire need. We’d call it the help word. That way, if anyone ever tried to steal a child, the child would only have to scream it for help to start coming their way.
In the Phase I world, calling 911 would mean the same thing as stating out loud that you need help.
With a little more software engineering, if two Alexas heard a gunshot, they could probably triangulate the position of the shooter and send their exact coordinates to the police. Or the position of the person calling for help. Oh, it would probably be pretty hard to get all the edge cases, but I bet you could get some if not most. Acoustics is a science we’ve understood for a long time.
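The core math is old and simple. As a minimal sketch: if two microphones a known distance apart hear the same shot, the time-difference-of-arrival gives a bearing to the source (a third microphone pair resolves an actual position). The function name and the far-field assumption are mine, not from any real product.

```python
# Far-field bearing estimate from time-difference-of-arrival (TDOA)
# between two microphones. Illustrative sketch only.

import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def bearing_from_tdoa(delta_t: float, mic_spacing: float) -> float:
    """
    Return the source's angle (degrees) off the broadside axis of a
    two-microphone pair, given the arrival-time difference delta_t (s)
    and the distance between the microphones (m).
    """
    path_diff = SPEED_OF_SOUND * delta_t
    # The path difference can't physically exceed the mic spacing;
    # clamp to guard against timing noise.
    ratio = max(-1.0, min(1.0, path_diff / mic_spacing))
    return math.degrees(math.asin(ratio))

# A shot heard 1 ms later at the far mic of a pair 0.5 m apart:
print(round(bearing_from_tdoa(0.001, 0.5), 1))  # ~43.3 degrees off broadside
```

Two such bearings from devices at known addresses intersect at the shooter, which is exactly the coordinate you would hand to police.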
That would be huge, wouldn’t it?
Passive 911.
The craziest part? Amazon could probably do all of that, today, with a software update.
PHASE II: Machine Probable Cause
Phase I would be a pretty miraculous technological delivery, and a pretty good perk for people who mostly just want an alarm clock that they can set with their voice and use as a radio.
Except, all of the above relies on something having already happened, or on a human identifying a problem. Ideally, we want to intercede before the perpetrator even knows we’ve seen them. Not pre-crime, really, but a window in which we can intervene between willful intent and harmful action.
Let’s take the modified Alexa from above, and give her a pair of eyes.
We’re going to give Alexa a really interesting pair of eyes, though. Alexa is going to see in a 360-degree arc all around her, and “she” will probably have more than one physical body. All of the Alexas in one location will share a common network, so that together they construct one big virtual map of their location and share it with one another. If one Alexa is destroyed, the others know to call the police. Also, we’ll have to put all of these Alexas someplace high so their field of view is as wide as possible. Say on the ceiling, or on top of flagpoles.
Although the computational requirements are much greater, image recognition technology has been growing at an incredible rate alongside sound recognition like that used by Alexa. Now we take Alexa and give her the ability to turn on if she sees a gun, or a person in any kind of mask, where they are not supposed to be.
Again, we’d use geo-tagging here so Alexa doesn’t freak out if someone is wearing a balaclava at “Bob’s Tactical Training Tourist Trap.” We can also be more specific with responses, since the primary function of this system is to be a security system. If you’re wearing a mask, Alexa will ask someone to come talk to you to figure out why you’re there. But if you have a gun? If you have a gun and you’re in, say, an elementary school, Alexa will immediately dispatch the police and keep precise track of your location. Not only that, Alexa will call out your location aloud, to the police as well as to the students and staff of the school.
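The escalation rules just described amount to a small lookup: what was detected, crossed with how sensitive the location is. A minimal sketch, with categories and response names invented for the example:

```python
# Illustrative Phase II escalation table. The detection categories,
# location types, and response names are assumptions for this sketch.

RESPONSES = {
    # (detected_object, in_sensitive_location) -> response
    ("mask", False): "none",                       # balaclava at the range: fine
    ("mask", True):  "send_staff_to_inquire",      # mask somewhere it shouldn't be
    ("gun",  False): "notify_police",              # gun spotted in public
    ("gun",  True):  "dispatch_police_and_track",  # gun in a school: full response
}

SENSITIVE_LOCATIONS = {"school", "bank", "hospital"}

def respond(detected: str, location: str) -> str:
    """Map a visual detection plus geo-tag to an escalation response."""
    sensitive = location in SENSITIVE_LOCATIONS
    return RESPONSES.get((detected, sensitive), "none")

print(respond("gun", "school"))   # dispatch_police_and_track
print(respond("mask", "range"))   # none
```

The table is the policy; everything hard lives in the image-recognition layer that produces the `detected` label in the first place.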
Imagine you are a would-be shooter and while you’re trying to sneak up and do your thing, a voice is blaring “He’s in the first floor hallway heading toward the West Entrance” or “He’s outside room number XXX.” Or better yet, a human being takes over the monitoring remotely and feeds this information directly to the police leaving the shooter unaware.
PHASE III: Semi-Autonomous Non-Lethal Weapons
“When seconds matter, police are minutes away” ~The NRA
“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” ~Benjamin Franklin
The above enhancements, to an essentially Alexa-like product, represent an incredible boost in our security and response capacity as a civilization. But the fact remains: we’d still need to wait for a human defender to intervene in the above cases, and while we can probably cut the death toll down significantly, we probably won’t be able to hit our goal of zero deaths.
Let’s stop calling this device “Alexa” for now, because we’ve gone completely outside of the scope of a Digital Assistant. Let’s call this Phase III device “Minerva.” The biggest change for Phase III is that Minerva has weapons.
Did your back just go straight? Are you thinking, “this is where he loses me”?
We can all agree Skynet is evil and that The Terminator didn’t portray a positive view of the future. We don’t want Minerva to become Skynet. We should absolutely not give Minerva the ability to fire depleted uranium shells or crush human skulls. But what if all a Terminator/Skynet/Minerva could really do was incapacitate you for a few minutes? Or maybe leave you with a headache? That’s not too bad, and it seems like a fair trade to stop mass murder.
Let’s give Minerva a few things that your mom probably has in her fanny pack when she goes on a walk through the scary part of town. Some pepper spray, a green laser, and a taser. Non-lethal weaponry designed only to incapacitate and disorient and better yet: all approved for use by Civilians today.
In the closed confines of a school, you could use pepper spray or a taser to almost totally incapacitate a gunman long enough for adult defenders to respond. Especially if Minerva can distinguish between foes and helpers, and let the helpers know when it is safe to approach and take the assailant’s weapons. There would probably need to be some remote monitoring of the system for that to occur, and maybe to help with targeting, but it’s doable, and we wouldn’t expect resource constraints on the monitoring given how rarely such events occur.
A green laser, as opposed to more dangerous wavelengths such as blue, has the ability to temporarily blind a person for about fifteen minutes by photo-bleaching the color receptors in their eyes. A laser, unlike pepper spray or a taser, also has the ability to fire on anything in its line of sight. If two Minervas had been present at the Vegas shooting, they could have triangulated the sound of the gunman even in a 32nd-story window, blinded him, and literally pointed an arrow at his location for all the police responders.
Who knows how many lives might have been saved if the gunman had been immediately blinded.
By the time he got off one shot, Minerva would know to look for him.
PHASE IV: Minerva Gets Her Wings, and Guns Become an Obsolete Technology for Personal Defense
Let’s take Minerva as she was in Phase III and add one additional change. Let’s give her wings. Or probably rotors. In any case, Minerva can fly now. Not always, of course. Batteries are lousy and lift is expensive, but we give Minerva enough power that she could patrol a neighborhood if she was reacting to something. Let’s shoot for fifteen minutes of high performance. There’s a Minerva every couple of blocks, high up on a telephone pole somewhere, waiting for a signal. Minerva is a semi-autonomous drone now.
Now, let’s put Minerva on a patrol car. When a cop pulls up somewhere and the situation is scary, they press a button on their steering wheel and it launches Minerva. In the future of Minerva, cops are no longer allowed to insert themselves into situations dangerous enough that having a gun makes sense. So police officers would no longer carry guns. They’d be too much of a liability.
In the distant future, if a gunshot happens and is determined to be violent, Minerva will neutralize the shooter by the time the police arrive. As more and more Minervas are produced, the likelihood of this gets higher and higher. By the time the police arrive, their only job will be to slap on some handcuffs and put the assailant in the back of the squad car.
In the future of Minerva, guns probably won’t even make sense anymore except as a hobby.
In fact, let’s explore that. Why do guns make sense?
Let’s look at this question like it’s an engineering problem.
1. People other than you exist.
2. Some of them will have bad intentions toward you, and intend you harm.
3. Some of them are more skilled in combat and/or larger than you.
4. You cannot defend yourself with your native powers in all situations.
5. Asking for help doesn’t necessarily mean someone will come.
6. The person who comes to help might not be stronger than the person trying to hurt you.
7. It takes time for the person who will help you to get there.
8. You might get hurt while waiting for help.
I call this set the “Mano a Mano” Problems.
9. If many people have guns, it makes a population harder to conquer and therefore deters tyranny.
10. It’s the best tool we’ve invented to equalize force between groups
I call this set the “Citizen vs. State Problems.”
To be fair, we should also look at why guns do not make sense:
1. Guns are naturally off, so you have to find a gun and then turn it “on” when you need one
2. Turning a gun “on” is incredibly binary and doesn’t allow for significant warning
3. Guns require training to be used effectively
4. Situations in which guns are most useful are situations where operators are least reliable
5. Anyone can use them, without regard to intent
I call these the “Shooter-Centered Problems.”
6. They don’t hit what you intend to hit in all circumstances
7. Sometimes you hit an innocent bystander
8. Effects of being shot are lethal, long-lasting, and/or expensive to treat
9. You have almost no control over the level of force applied
10. You can’t unshoot a bullet.
I call these the “Unintended Problems.”
Now let’s look at what Minerva does:
A) Offers immediate force neutralization upon the initiation of violence, on a timescale approximately equal to or less than that of finding a firearm
This solves problems 1–8, the “Mano a Mano” Problems, and also eliminates the “Shooter-Centered Problems,” because Minerva is a machine, inherently non-lethal, and would eventually be near-omnipresent. Minerva doesn’t get nervous. Minerva doesn’t sleep. Minerva’s vigil never lapses. If you destroy Minerva, another piece of her will come.
Also, because Minerva is non-lethal, the effects of her unintended consequences are much lower. Minerva can only hurt you for a little while, and I think we can all agree that a bunch of children being accidentally pepper-sprayed and spending a few hours in the nurse’s office with a hell of a story is a better outcome than being shot and killed.
B) De-centralizes enforcement of rule of law
At first glance, it seems like Minerva does a lousy job of solving for problems 9 and 10. How could we initiate something like Minerva without unleashing a dystopian surveillance state? Wouldn’t all the Minervas end up in the hands of the state and be used to suppress rioting/revolt? Or to spy on people?
These are all problems, to be sure. The widespread acceptance of Minerva would require radical legislative change. For one, Minerva would be privately owned, but we’d have to make it absolutely illegal, with severe penalties, to try to make a Minerva lethal or reprogram it for sinister purposes. It’s conceivable that a single individual with a lethal drone army could conquer nations not similarly armed. That can’t be allowed. Also, Minerva should be systemically unusable as a stealth surveillance system. Meaning, even if you wanted to, you couldn’t use her to record and spy on people. Penalties for attempting to do so would likewise need to be severe.
In addition, to balance power, police should not be able to carry firearms in their capacity as police. There’s no purpose for this in a world with a comprehensive and universal non-violent response. It makes sense for a cop to carry a gun today, but at some point in the future it won’t be anything but a symbol of tyranny. A police officer should never be in a situation where there’s a possibility of lethal response. Minerva is inherently de-escalating.
The potential benefits are even greater if you think about prison. Does prison even really make sense if it’s pretty much guaranteed that people can’t hurt each other? If you tried to do something bad, depending on what it was, and nobody got hurt and you reasonably knew you’d be stopped, does it make sense to lock you away for that? I don’t think so. Probably, in the future, prison would only make sense for doing really bad things, or if you got through the Minerva safeguards and successfully hurt someone.
Now, we arrive at my dream.
We could live in a world with so little murder that if anyone was murdered anywhere, it would be a matter for international politics. And every time the system made a mistake, we could provide an over-the-air software patch to close the loophole. Minerva could be antifragile.
Parents could send their children outside and tell them to be home by dark, and no one would even need to worry. The people who would hurt children would know they can’t succeed. A kid listening to his stepfather beat his mother would only need to ask for help, and help would come without anyone needing to call 911. We could take the best and bravest part of us and encode it in silicon, so that when a loser with a gun, wanting to show the world how terrifying he is, raises that gun to his shoulder, he will be immediately blinded. The nine-year-old lives. Gabby Giffords never has to learn to walk and talk again. The mother of the Sandy Hook child lets out a sigh of relief when her daughter rushes to her once the school lockdown ends. Every parent looking for a child finds their child. “No fatalities, Mom,” she says. And they all watch as the would-be killer is put in handcuffs, wondering how he failed even to die in a blaze of glory.
That’s my dream, and I can’t figure out why I’m such a godawful coward that I’ve never tried to make it real.
PHASE V: What About War?
“I do not know with what weapons World War III will be fought, but World War IV will be fought with sticks and stones” ~Albert Einstein
I’ve thought about autonomous drone warfare a lot, and I find Einstein is still correct. Whatever weapons we have in the coming years, if the nations of the world ever came to all-out conflict with advanced weaponry, all of modern technology would probably be erased in a matter of days. I think building a lethal version of Minerva is a really stupid idea because of the threat of AI superintelligence, but it’s pretty dumb for lots of other reasons as well.
I believe it is also banned by treaty.
In other words, I can’t imagine this making the prospect of WWIII any more likely to result in sapiocide than it is today.