The Real-Life Consequences of Algorithms: Not just the election, but what we’re missing every day

Years ago, this line in a Psych textbook captured my attention: “At this stage of development, a four-year-old child sees the leaves on a tree moving, but misunderstands causation: He may think that the leaves moving cause the wind to blow, not the other way around.”

That’s kind of charming, right?

But you know what’s not so charming? When we think that Black teenagers are thugs and are causing white police officers to kill them.

Or that women gamers who speak up about objectification are causing men to dox them and threaten to rape and kill them.

Or that the President is causing mass shootings so he can justify taking away automatic weapons.

Or that immigrants are lying in wait to threaten our freedom and kill us, so we should build a wall.

Way less charming.

When children misread cause and effect, we see it as a natural, if primitive, understanding of the world. They’ll grow out of it — and learn that winds are a powerful invisible force, drawn to vacuums in nature that they fill and shape in sometimes very visible ways — tornadoes, dust storms, or leaves rustling on a blustery day.

Collectively, it feels like we are using the Internet like four-year-olds. But we’re growing into it — not out of it.

Thank you, algorithms. It’s kinda perfect that we call them our social feeds: they are what we’re ingesting and live on. Unfortunately, we’ve created 7 billion microclimates — monocultures, really — that harvest only our most comfortable views, whether happy or poisonous, nourished with every click. And it’s not healthy: biodiversity has helped life on Earth thrive for billions of years, but it’s now possible for us to live our whole lives without seeing a different opinion. Or face.

And there are real-life consequences. Not just the election, but in how we navigate the world: whom we trust, whom we fear, whom we hire. It’s not that algorithms are inherently bad. It’s that instead of an objective, shared truth or history, algorithms are giving us a version of the truth filtered by our other clicks, likes, friends, purchases, secret fears, and biases.

I call it the leaves rustling, making the wind blow.

This is deeply personal to me. And it’s already affecting you, too. I’ll get to that in a moment.

In theory, algorithms work exactly as they should. You click on something; it registers. You see more of what you like. That’s the unscientific explanation of a recommendation engine:

Like + Click = More of What You Liked
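Here is a minimal sketch of that loop in Python. Everything in it, the topic names and the one-point-per-click scoring, is invented for illustration; real recommendation engines weigh hundreds of signals, but the core feedback loop looks roughly like this:

```python
# Toy recommendation engine: every click nudges a topic's score upward,
# and the feed simply shows whatever scores highest right now.
# (Topic names are invented for illustration.)
from collections import defaultdict

scores = defaultdict(float)

def register_click(topic):
    """Like + Click = More of What You Liked."""
    scores[topic] += 1.0

def build_feed(candidates, size=3):
    """Rank candidate stories by how much you've clicked their topic."""
    return sorted(candidates, key=lambda t: scores[t], reverse=True)[:size]

register_click("outrage")
register_click("outrage")
register_click("kittens")

print(build_feed(["science", "outrage", "city-council", "kittens"]))
# -> ['outrage', 'kittens', 'science']  (ties keep their original order)
```

Two clicks on outrage, and outrage floats to the top of everything you see next. That is the whole trick.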

But consider what happens editorially, at the source: on the editor’s desk or terminal. The editor has a staff of people to support, and in many newsrooms, those people are paid by your clicks. Even if that staff faithfully reports an objective truth … if it’s not on FB, are we certain it happened? If it’s not popular or shared, do we still need to pay attention?

News used to travel slowly: pictures were rare. Stories were reported, disseminated and analyzed, and at some point became our shared history. Now, though media is democratized (which sounds noble), we are buried under an avalanche of “real time” images and news.

To get us to click, the stories, headlines and images aim precisely at the intersection of our beliefs, fears and the stuff we don’t know from personal experience, but worry may be true.

There’s so much to see and know, and we know it all for at least a second or two. Then we forget. I mean, really: how many people even remember that we had a crisis of refugee children coming in from Honduras? Or that none of them gave anyone Ebola?

When you click, you think you’re making a judgment about others. But somewhere, a machine is making a judgment about you. And what you click on becomes more of what you see. Each click fills a void as the machine learns more about you. A picture with one perspective becomes more relevant than one with a different take. And it becomes the truth.

So? If you clicked on the picture of boys looting at Ferguson, but not the boys trying to save a store from looting? That tells the algorithm what to feed you. Whether you are more profitable when motivated by fear, or by goodness. And it’s what you’re going to see more of.
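To see how fast that narrows a feed, here is a deliberately crude simulation, with all names and numbers invented. It imagines a feed that always serves the current front-runner, where each showing invites another click. Real feeds mix in freshness and exploration, but the reinforcement dynamic is the point:

```python
# Deliberately crude sketch: two framings of the same event. One has a
# tiny head start (say, you clicked it once). A feed that always shows
# the current front-runner never surfaces the other framing again.
weights = {"boys looting": 1.0, "boys protecting a store": 0.0}
shown = []

for _ in range(10):
    story = max(weights, key=weights.get)  # the feed serves the leader
    shown.append(story)
    weights[story] += 1.0                  # seeing it earns another click

print(shown.count("boys looting"))  # -> 10; the other framing never appears
```

One early click, and the gap only widens. The second story isn’t suppressed by anyone; it simply never gets another chance to be seen.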

The more emotional, surprising or outrageous the claim or image, the more likely we are to click. The writer Joseph Conrad called it “the fascination of the abomination” — and brain research has demonstrated our negativity bias. Inbound marketing calls it “exploiting the knowledge gap.”

Whatever you click on, you are making an editorial decision on what is true, whether you realize it or not. In the process, we are losing an objective truth to filter our world by, and we are paying dearly for it.

Asking for a friend: Do our clicks become our truth? Even when they’re not the whole truth? Can they influence an election? (See: perpetual email-server stories and “crooked Hillary.”)

But I said it’s personal, so forget politicians and celebrities and shock for a moment. Can we even fairly see the people we meet every day if our filters are changing how we see them?

Just last year, New York Times reporter Claire Cain Miller found that our algorithms — supposedly unbiased — are in fact biased.

“Research from Harvard University found that ads for arrest records were significantly more likely to show up in searches for distinctively black names or historically black fraternities.”

— Claire Cain Miller, When Algorithms Discriminate

We wring our hands and devote blogs and stories to “the pipeline problem,” as if there’s a dearth of brilliant-people-who-aren’t-white-or-male: We just can’t find them! But we are a meritocracy, so they must not be good enough.

Remember: It’s in the mental places where we DON’T have personal firsthand experience that we are most vulnerable.

We are segregated by our Likes. No, scratch that; we are segregated by our fears.

Then again, it’s not like we all see the same news, even on a topic that seems like it doesn’t leave room for interpretation. Like, say, emails.

This part isn’t new. If you saw Upworthy co-founder Eli Pariser’s TED Talk on “filter bubbles,” you know that algorithms already sort search results and news based on what we like. I mean, “Like.”

What’s new is that the consequences are no longer an intellectual exercise, if they ever were. It’s not just “interesting” that we no longer have a shared, objective context through which to filter our world. It’s not “just politics” that the barrage of algorithms and fake news may have altered this most recent election.

Just this weekend, Mike Isaac wrote a piece about soul-searching among executives at Facebook, wondering whether they had influenced the election, amid accusations and (some really great) investigative reporting showing hundreds of websites creating and spreading fake news on Facebook and the Web. And Mark Zuckerberg weighed in personally, reassuring us that 99% of what we see is “authentic.”

Dear Mark, algorithms influenced the election. Of course they did.

You’re like, “well, NBD. We just have to Google it to find the truth.”

Maybe you can find the truth. Unless someone bet that you are an impatient, distracted, gullible person — and SEO’d the results so that the actual truth is on page nine of your Google results instead of page one.

Not kidding.

In 2014, Rachel Aviv wrote a piece for the New Yorker about Tyrone Hayes, a scientist trained at Harvard and Berkeley who published peer-reviewed studies showing all kinds of nasty results in the groundwater where a chemical called atrazine was deployed. Suddenly articles popped up calling him a bad scientist; strangers showed up to tape his talks. He publicly wondered whether the company behind atrazine was trying to discredit his work. And then MORE articles popped up — calling him delusional, a phony and even a paranoid narcissist for presuming that any company would care about his little experiments.

One of the towns downstream from where atrazine had been deployed decided to file a class-action lawsuit against the company. And — only because of the lawsuit — thousands of internal company memos were released.

You’re not paranoid if people are actually out to get you.

Suddenly he didn’t seem so paranoid anymore.

I’m a fan of Rachel Aviv’s work in general. But in this multi-thousand-word article were a couple of sentences that blew my mind:

Its public-relations team compiled a database of more than a hundred “supportive third party stakeholders,” including twenty-five professors, who could defend atrazine or act as “spokespeople on Hayes.” The P.R. team suggested that the company “purchase ‘Tyrone Hayes’ as a search word on the internet, so that any time someone searches for Tyrone’s material, the first thing they see is our material.” The proposal was later expanded to include the phrases “amphibian hayes,” “atrazine frogs,” and “frog feminization.” (Searching online for “Tyrone Hayes” now brings up an advertisement that says, “Tyrone Hayes Not Credible.”)

(Note: As of this writing, the second result on a search for Aviv’s piece brings up a Forbes piece by Monsanto’s Jon Entine, trying to discredit Aviv’s work. I mean, it’s so meta that I can’t even.)

The truth? Apparently the truth can be purchased. And hidden. In my firm, we have a rule: no spin, no lies. Find the true story that you can credibly tell and make it great.

Elsewhere, it’s called something more like, “protecting a client’s interests.”

Well, that’s SEO, you say. Everyone knows that.

But does everyone know that?

Which Tyrone Hayes you believe — the accomplished scientist whose work led him to surprising results, or the paranoid narcissist — may depend on what Google thinks you’d Like to believe.

I said at the beginning: this is deeply personal to me.

If we measure the people in our lives by their influence on the person we become, I can say that Maree Trice was one of the best friends I ever had.

Maree — in all her magnificence and glory.

I grew up in an integrated neighborhood, and my friends look pretty much like America. So it’s not that she was my only Black friend. Far from it.

It’s that she changed me forever.

Maree was brilliant, curious, hilarious. She had a cat named Featherhead J. Negro. She loved blues guitar and science fiction. She was a nerd. She knew how to code — back when it was still called programming. She was kind, too. But she also had no tolerance for bullshit.

She taught me how to respect myself, how to not apologize for my opinions. I learned that I could be kind, but also not mince words; I could be gracious and still know when it was time to just walk away. I owe the woman I am today to her friendship.

She was at once the most hopeful and most bitter person I ever knew.

Hopeful, because she was certain the world would see beyond the surface to the person she was and to all she had to offer. Because she’d done everything right: got out of the ‘hood, got her degree, tried to carve out a life for herself where her talents would be valued. Hopeful to be more to you than just her bigness or blackness or womanness.

Bitter, because Maree was perpetually passed over for any job that utilized her intelligence and her coding abilities. Yet she didn’t want to give up. She had hard-won skills that were in high demand.

We say that there’s a “pipeline problem.” We say that there are no Black women programmers. And we all say, it’s just a matter of trying. But it isn’t.

Maree tried. Countless times. And as her life unfolded, she was never hired in a single tech role, ever.

And yes, to an extent, I blame the images — the ones that dwell on a shooting committed by a Black person, but skip lightly over one committed by someone of almost any other race. She seemed scary: not just her blackness, but her refusal to style herself as a sweet Black woman, the kind that we apparently are more comfortable with as a white person’s witty sidekick than as a force of nature, all on her own.

She could have been the difference for some early tech firm that failed. But we’ll never know. They’ll never know.

She died a few years ago. It broke my heart that I had lost touch with her — she couldn’t afford a computer or internet access and wasn’t on Facebook. I would search regularly, and in vain. I didn’t find her until it was too late.

I found Maree’s obituary.

I’m not here to have you pity her. I’m here to tell you that you all missed out by not knowing her, by not employing her. Maree’s nimble mind, her willingness to pitch in, and her attention to detail would’ve been great in so many jobs.

The people who could have hired her didn’t really see her, did they? When we see a Maree, a Tyrone Hayes, a Trayvon Martin, an Anita Sarkeesian — anyone we pattern-match into a group — do we see the people they are, with all their faults and gifts? Or do we see something the Internet filtered for us, because we clicked on the boys looting at Ferguson, and not the boys protecting Ferguson?

THIS is my point: your clicks and likes may be ones and zeros — but they’re not abstract. Algorithms that feed us a diet of suspicion and images without context have real consequences, on Facebook or anywhere else.

They circumscribed Maree’s life by keeping her out of yours. They also circumscribed your opportunity to choose to work with her as a colleague, to know her as a friend, to honor her as a woman and as a person.

It is your loss, in every sense of the word.

So this is where I leave you.

Become your own algorithms. It’s awesome that FB and Google and others are joining forces to ban fake news. But it’s bigger than that. It’s everywhere.

Before you click, remember what you’re telling us about yourself, and ask if you’re creating the world you want to live in.

Becoming your own algorithms means choosing not to let online impressions tell you a story about someone who’s different, but who also might be the same, who might make your life better and become the best friend you ever had.

If you come across someone in your everyday life, at work, on the street, even sitting next to you right now, whose difference triggers in you a brief anxiety you can’t quite explain? Ask yourself if your reaction is based in your own personal experience, or if it’s based on something like the wind: an unseen force that blew through your feed, seeking out the voids, based on nothing other than ones and zeros.

And then? Make a personal experience, away from filters and screens. Find — and see — the Marees that are still out there, in whatever form they may take.

You will be better for it. I was. I am.

And we just might fix the world.
