Email, Algorithms, and the Vulnerable

For a brief moment last year, Silicon Valley’s hottest new product was an email app called Superhuman. The service costs $30/month (!), and its marketing sounds like the first draft of a voiceover for a luxury car commercial:

“Superhuman is not just another email client. We rebuilt the inbox from the ground up to make you brilliant at what you do. We specifically designed it for those of you who want the best…Superhuman is so fast, delightful, and intelligent — you’ll feel like you have superpowers.”

(Side note — I love how they couldn’t resist explaining the punchline of their own product name even though it’s the most obvious thing in the world.)

Superhuman started their marketing blitz last June — venture capitalists evangelized the app on Twitter and the New York Times published an article called “Would You Pay $30 a Month to Check Your Email? One of Silicon Valley’s buzziest start-ups, Superhuman, is betting its app’s shiny features are worth a premium price.”

One of those “shiny features” is what Superhuman calls “read receipts.” While the New York Times failed to mention any details about the feature, early-access Superhuman user (and former VP of Design at Twitter) Mike Davidson wrote a 4,000+ word blog post about it: Superhuman is Spying on You.

In Mike’s article (which is one of the most nuanced, thoughtful reflections I’ve ever read on how product decisions get made in Silicon Valley), he explains everything that’s wrong with Superhuman’s “read receipts” feature:

“You’ve heard the term “Read Receipts” before, so you have most likely been conditioned to believe it’s a simple “Read/Unread” status that people can opt out of. With Superhuman, it is not. If I send you an email using Superhuman (no matter what email client you use), and you open it 9 times, this is what I see: a running log of every single time you have opened my email, including your location when you opened it.”

Meaning: if I use Superhuman to send you an email, I can see when, where, and how many times you opened my email — regardless of what email app you use. Without you knowing. And to make matters worse:

“Superhuman never asks the person on the other end if they are OK with sending a read receipt (complete with timestamp and geolocation). Superhuman never offers a way to opt out.”
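A quick technical aside: tracking like this is typically done with a “tracking pixel,” a tiny invisible image embedded in the email whose URL is unique to each message. Every time your mail client loads remote images, it fetches that pixel from the sender’s server, which can log the timestamp, tally another open, and estimate your location from your IP address. Here’s a minimal sketch of the idea in Python; the server, endpoint name, and logging are my own illustrative assumptions, not Superhuman’s actual implementation:

```python
# A minimal sketch of an email "read receipt" tracking pixel.
# Everything here (endpoint, logging, GeoIP note) is illustrative;
# it is not Superhuman's actual code.
import io
from datetime import datetime, timezone

from flask import Flask, request, send_file

app = Flask(__name__)

# A 1x1 transparent GIF. The sender embeds it in the outgoing email as
# <img src="https://tracker.example.com/open/<message_id>.gif">
PIXEL = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
    b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00\x00"
    b"\x02\x02D\x01\x00;"
)

@app.route("/open/<message_id>.gif")
def track_open(message_id):
    # Each time the recipient's mail client loads images, it re-fetches
    # this URL, so the sender learns about every single open.
    opened_at = datetime.now(timezone.utc)
    ip = request.headers.get("X-Forwarded-For", request.remote_addr)
    # A real tracker would persist this event and resolve the IP to a
    # rough (city-level) location with a GeoIP database.
    print(f"message {message_id} opened at {opened_at} from {ip}")
    return send_file(io.BytesIO(PIXEL), mimetype="image/gif")
```

Unless your email client blocks or proxies remote images, every open triggers one of these requests, which is why the tracking works “no matter what email client you use.”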

In his post, Mike imagines three short stories to highlight the potential for abuse enabled by this feature. I’ve excerpted the first sentence of each:

  • “An ex-boyfriend is a Superhuman user who pens a desperate email to his former partner.”
  • “A pedophile uses Superhuman to send your child an email. Subject: ‘Ten Tips to Get Great at Minecraft’”
  • “Superhuman decides they can make more money by supplementing their subscription fees with data licensing agreements.”

😳

I’m sure that no one at Superhuman wanted their email product used in that way, which raises two questions:

  1. Did anyone at Superhuman think about the potential problems of exposing location data without consent?
  2. If someone did raise concerns, why were they ultimately overruled?

Unfortunately, Superhuman’s oversight isn’t an anomaly. There are countless examples of how Silicon Valley’s blindly optimistic view of tech (among other attributes) creates “unintended consequences” that can cause irrevocable harm.

Perhaps the most famous example: content recommendation algorithms. What was originally designed to help you “discover” (to use Valley terminology) things you’re interested in also enables the rapid spread of misinformation and the development of extremist views.

Or ad targeting. What was originally designed to show you the perfect pair of shoes also enables the spread of political propaganda and fear-mongering to the most susceptible demographics.

In Superhuman’s case, there are a couple of simple solutions to the problem they introduced: they could require users to opt in to sharing their location, or remove the feature altogether. (Superhuman did change the feature in response to Mike’s post, though he still felt the changes were inadequate.)
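To make the opt-in idea concrete, here’s one hypothetical way to gate the earlier tracking-pixel sketch on consent; the consent store and helper function are my own inventions, not anything Superhuman shipped:

```python
# Hypothetical opt-in gate, extending the earlier tracking-pixel sketch:
# record an open only if the recipient explicitly agreed to read receipts.
CONSENTED_RECIPIENTS: set[str] = set()  # a real service would use a database

def record_open(message_id: str, recipient: str, ip: str) -> None:
    if recipient not in CONSENTED_RECIPIENTS:
        return  # no consent: serve the pixel, but log nothing
    # ...persist the timestamp (and nothing location-based) only for
    # recipients who opted in...
```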

But as more and more of the core technology woven into our lives becomes software-oriented and algorithmically driven, finding solutions to “unintended consequences” gets thornier. It’s sometimes literally impossible to understand why things are happening the way they are. The algorithms are black boxes.

As Abeba Birhane, a PhD candidate in Cognitive Science at University College Dublin, writes in The Algorithmic Colonization of Africa:

“Data and AI seem to provide quick solutions to complex social problems. And this is exactly where problems arise. Around the world, AI technologies are gradually being integrated into decision-making processes in such areas as insurance, mobile banking, health care and education services.”

The issue with this shift, Birhane argues, is that

“Society’s most vulnerable are disproportionally affected by the digitization of various services. Yet many of the ethical principles applied to AI are firmly utilitarian. What they care about is “the greatest happiness for the greatest number of people,” which by definition means that solutions that center minorities are never sought.

“But their voice needs to be prioritized at every step of the way, including in the designing, developing, and implementing of any technology, as well as in policymaking. This requires actually consulting and involving vulnerable groups of society, which might (at least as far as the West’s Silicon Valley is concerned) seem beneath the “all-knowing” engineers who seek to unilaterally provide a “technical fix” for any complex social problem.”

Birhane’s suggestions for a creative process sound a lot like what I’ve been reading about Jesus in the Gospel of Luke. Jesus says that he came

“to proclaim good news to the poor…to proclaim freedom for the prisoners and recovery of sight for the blind, to set the oppressed free.”

That does sound like good news. If I could do those things, I’d like to do them. But it also sounds lofty. (I’ll refrain from making an “if Jesus was making apps he wouldn’t collect your location data” joke oops I just made it)

I’ve spent this email pointing the finger outward. It’s time to point the finger back at myself. Superhuman, algorithms, Birhane, Jesus…

How am I pushing myself to the edges, to consider minorities, to involve vulnerable groups of society?

This post was originally published in The Valleyist, my newsletter exploring the intersection of technology and spiritual faith, published ~2x/month.
