Tech’s Adversaries vs Enemies

Alex Stamos
10 min read · Jan 13, 2020


This is the speech I was honored to give at the Winter 2019 commencement for the University of California, Berkeley School of Information. The class included the first cohort of graduates from Berkeley’s Master of Information and Cybersecurity program.

The Campanile, UC Berkeley’s clock tower, at sunset. CC BY-SA.

Greetings graduates, family, faculty and distinguished guests.

This is what I want to share. It doesn’t matter how far you might rise. At some point you are bound to stumble because if you’re constantly doing what we do, raising the bar. If you’re constantly pushing yourself higher, higher the law of averages not to mention the Myth of Icarus predicts that you will at some point fall. And when you do I want you to know this, remember this: there is no such thing as failure. Failure is just life trying to move us in another direction. And the key to life is to develop an internal moral, emotional G.P.S. that can tell you which way to go. Because now and forever more when you Google yourself your search results will read “Harvard, 2013”.

So, if that opening seemed a little generic and perhaps targeted at the wrong graduating class, it’s because I didn’t write it. This is my first commencement speech, and my nervousness in addressing you today led to some pretty serious writer’s block. You folks have all worked very hard to make this day happen, and I have been, perhaps mistakenly, granted an opportunity to retroactively ruin your memory of all that work as well as embarrass you in front of your family members.

So, I did what any self-respecting techie would do in this situation: I threw some machine learning at the problem. While on a flight home yesterday, I trained OpenAI’s GPT-2 text-generation neural network on the celebrated graduation speeches of Oprah Winfrey, Steve Jobs and David Foster Wallace, and I let the AI write my opening paragraph. What you heard was actually my second attempt; I first tried training the model on the top ten graduation speeches of all time as chosen by Time Magazine, but the inclusion of Winston Churchill and General George Marshall caused the AI to insert unnecessary and unsettling references to the German menace.

Even with a reduced training set of more contemporary graduation speeches, the ML-generated speech was not going to be bearable for the fifteen minutes of commencing everybody expects, so I guess I’m on my own from here on out.
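(For the curious: a fine-tuning run like the one described above might look roughly like the sketch below, using PyTorch and the Hugging Face transformers library. This is purely illustrative rather than what was actually run on that flight, and speeches.txt is a stand-in for whatever corpus of speech transcripts you assemble.)

```python
# Illustrative sketch only: fine-tune GPT-2 on a small corpus of graduation
# speeches, then sample an "opening paragraph". Assumes a local speeches.txt
# file (hypothetical) plus the torch and transformers packages.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.train()

# Tokenize the corpus and split it into fixed-length training chunks.
text = open("speeches.txt", encoding="utf-8").read()
ids = tokenizer(text, return_tensors="pt").input_ids[0]
block = 512
chunks = [ids[i:i + block] for i in range(0, len(ids) - block, block)]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
for epoch in range(3):                          # a few passes over a tiny corpus
    for chunk in chunks:
        batch = chunk.unsqueeze(0)              # shape: (1, block)
        loss = model(batch, labels=batch).loss  # causal language-modeling loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Sample a commencement-flavored opening from the fine-tuned model.
model.eval()
prompt = tokenizer("Greetings graduates,", return_tensors="pt").input_ids
out = model.generate(prompt, max_length=200, do_sample=True, top_k=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```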

I’ll start with some career advice for the newly minted graduates of the School of Information: cybersecurity and online safety are great fields to go into. Why? Because they are the only parts of computer science and engineering that actually get worse every year.

In every other way, computing technology has gotten pretty damn amazing. Every adult in this room is carrying around a pocket supercomputer with more compute power than all of NASA had when they put a man on the moon, storage equivalent to the printed holdings of a large university library, and a constant connection to billions of other pocket supercomputers and much of the collective sum of human knowledge. Not a single aspect of our lives has been left untouched by the unstoppable progression of computers getting more powerful, smaller and cheaper.

So why then do security and safety get worse every year? Well, it turns out that the same advances that make computing more powerful also make it more complicated and fragile. The same abstractions and high-level languages that allow a much larger number of software engineers to be productive also allow them to operate with very little understanding of how these systems work many layers down. Most importantly, the incredible decrease in the cost of compute has led the companies that make and sell these products to push technology into every nook and cranny of modern life. And when you insert technology into the messiness of human existence, you end up creating the possibility of small technical mistakes being amplified into suffering and tragedy.

This is why you graduates are going to spend your careers dealing with a whole new set of dystopian realities that were not on the radar of anybody in my graduating EECS class 19 years ago. Even those of us who were really interested in security at the time weren’t considering the possibility that we would be collectively responsible for malware-vulnerable lightbulbs or working people losing their life savings from compromised email accounts. We certainly didn’t foresee tech’s involvement in teenage bullying or the online fomenting of ethnic violence. Our generation of technologists was not ready for these possibilities, and I’m sorry to say that we’ve left all of you with a little bit of a mess to clean up.

One of the other reasons that security and safety are getting worse is that, in contrast to other fields of engineering, there are people who are paid to spend their days making it worse. We often call these “adversaries”. Adversaries is a great word. It’s the word cybersecurity professionals like to use for the humans, perhaps sitting at keyboards thousands of miles away, who are trying to break into the systems we protect or the software we write. The word is not inaccurate, although it is sometimes deployed unnecessarily and performatively. In those situations, you can hear the implied italics in the intonation, as if the speaker is just right then, live in front of you, deriving the term from the Latin root. Adversaries. It’s bloodless. Professional. Neutral, in that it can be used for anybody from a teenager in a basement to a signals intelligence officer drinking government coffee in a quiet office park. Using “adversaries” identifies the speaker as a member of the cybersecurity initiated, and, most importantly, sounds a lot better than “bad guys”. I use the term adversaries almost every day.

But we need to be careful to not let our adversaries distract us from our enemies. These words are not synonyms. Our adversaries are people. People who come and go depending on our job, their job, what’s happening in the world to drive certain conflicts and what products we have shipped. Our enemies, on the other hand, are the things that hold us back from doing better. And the real enemies in the technology world are arrogance, complacency, and a lack of empathy for those we are supposed to protect. And if you give me a moment, I would like to explain your enemies to you, so that you may someday defeat them in a way those of us who came before you could not.

I think the arrogance of the tech industry is pretty self-evident. It is arrogance to believe that our skills at building devices that are, fundamentally, just really fast versions of the graphing calculators our parents were forced to buy, and at talking to those super-powerful graphing calculators in funny graphing calculator language, make us qualified to destroy and rebuild, excuse me, I meant “disrupt”, every facet of society. It is arrogance that has led us to apply the minimum viable product strategy to shipping products that transform areas of human endeavor informed by hundreds of years of philosophical debate. It is arrogance that causes us to use the term “artificial intelligence” to describe brute-force statistical methods that, as you heard from the opening paragraph, are barely capable of performing a half-assed job, with potential failure modes that we can’t possibly understand. That’s arrogance.

Complacency is the acceptance of a status quo that, if properly reexamined with fresh eyes, would lead any ethical software engineer to retire to the mountains to build children’s furniture out of fallen redwoods instead of ever again invoking a compiler. You see this complacency everywhere you look in the modern world of tech. Every time you put a username and password into a website that does something critical, you are experiencing the collective complacency of the entire technology industry. The username/password paradigm is a relic of the timeshare mainframe days of the 1970s, when a university or corporation needed to keep track of which department to charge for that CPU time or the use of punch card blanks. It is not a model of authentication that is at all appropriate for somebody who is using a pocket supercomputer to connect to, say, the computers that control their retirement accounts or their most sensitive medical records. At my previous employer, we caught over 500,000 accounts being taken over every day because somebody reused their password. This is just what we caught on our own, with each of those numbers representing a real person who might have their entire life turned upside down by a professional thief. Yet, due to collective momentum, our industry is still asking people to memorize hundreds of unguessable strings as a prerequisite to participating in the benefits of modern existence.

And that brings us to the lack of empathy. Empathy is the ability to put yourself in somebody else’s shoes and to feel what they are feeling. As one of my favorite authors, Neil Gaiman, once said (and I’m talking about the real man this time, not an AI imposter): “Empathy is a tool for building people into groups, for allowing us to function as more than self-obsessed individuals.” Despite what you might read in the papers, I don’t think that technologists are uniquely unempathetic to others. Many of my colleagues have shown themselves to be incredibly thoughtful, emotionally available and empathetic. The challenge for all people is having the same empathy and understanding for those who live very different lives from ours, in very different circumstances. This is a universal shortcoming, which you often see reflected in the difference in how we react to tragedies close to us versus those affecting people in a faraway land. It’s something we should all strive to improve about ourselves. But in the case of technologists, like all of you graduates, the challenge of understanding the plight of others can make us much worse at our jobs.

We are not the first people in history to be asked to apply specialized skills to improve the lives of others. Engineers, scientists, doctors and many other professions have demonstrated responsibility to others, whether that was a responsibility to build a trustworthy bridge, deliver safe drinking water, or prevent disease. These professions often have the benefit of direct relationships with those whom they serve, or are at least part of the culture where their work will have the greatest impact. That can make having empathy relatively easy. This is much harder when you are building something that millions or billions of strangers might use, especially when those strangers speak hundreds of languages, come from a multitude of different cultural backgrounds, and are interacting with modern information technology in a way that seems completely alien to somebody able to complete a Master’s degree in Information from the world’s greatest university. Technologists almost always end up building technologies for ourselves or, at most, for people with whom we are immediately familiar. I’ve heard so many of my colleagues say something along the lines of “we should make sure this is usable by our parents”. Our parents, some of whom are in the room, are not at all reflective of the diversity of needs and experiences of the people who use technology. Even if we are trying to build for our parents, we are failing, as is proven by the fact that every single graduate sitting here has had their Thanksgiving or summer breaks ruined by the need to fix the WiFi, clean malware off the family desktop or have a difficult talk with their Dad about forwarding what he saw on Facebook to the family group chat.

So what can you do about these enemies? The enemies of arrogance, complacency and a lack of empathy. The enemies that might rob you of the only thing that matters in your career: the ability to look back and honestly say your time was spent helping others. The only advice I can give you is to try to be aware of both yourself and the culture within which you work. Corporate culture is the invisible intellectual air that nourishes us and feeds us and makes our work possible. Corporate culture can also infect us and help us rationalize decisions that we would never have made on our own. I don’t claim any special ability to resist culture in this regard, as I certainly have been a part of decisions that I look back upon with regret.

It is seductive to go along with the expectations of your boss, your colleagues, and your shareholders, and you must resist that pull. It can also be seductive to put yourself on a path where you might never be faced with hard decisions that you might regret, or where you are free to always criticize without taking any ethical risks of your own. There are times I regret, when I could have spoken up or been more forceful and eloquent, and I imagine alternate scenarios where I could have been more effective at fixing the issues I care about. What I don’t regret is putting myself in the room where the decisions are made, and I’m trying to resist the nihilism that can come from looking back and seeing myself fail by remembering the times when I made a small difference. Changing a company, government, industry, or society from the inside is hard. Deciding to get on a massive ship with a tiny oar in your hand and telling yourself that maybe you can help it avoid the iceberg is terrifying. Doing so is not a choice I can recommend, much less make for you. But if you choose a direction where your work might have great impact, if you can have the self-knowledge to understand why you and your colleagues are failing, and if you have the courage to confront people who you otherwise like and respect, then you have a chance at vanquishing your enemies.

So, let me close by wishing you the greatest success in your career. Success not just measured in financial reward or public accolades, but in knowing yourself and feeling confident that your efforts have made the world a slightly better place. Or as my good friend, a statistical-model-built-from-the-words-of-better-graduation-speakers-running-on-a-graphics-card once said…

In closing I wanted to tell you that I consider today as I sat on the stage this morning getting teary for you all and then teary for myself, I consider today a defining milestone in a very long and a blessed journey. The “Oprah Winfrey Show” was number one in our time slot for 21 years and I have to tell you I became pretty comfortable with that level of success.

Thank you, congratulations, and Go Bears!

Students waving a Cal flag. Go Bears! CC BY-SA.


Alex Stamos

Teaching and researching the misuse of technology at Stanford. Former CSO of Facebook and Yahoo.