Who would transparency in search algorithms benefit?

Jeremy Kun
4 min read · Nov 6, 2016


Angela Merkel wants algorithms to be more transparent.

“I’m of the opinion that algorithms must be made more transparent, so that one can inform oneself as an interested citizen about questions like ‘what influences my behaviour on the internet and that of others?’”

Google’s honestly excellent when you dig into the details. They let you control what ads are served to you. They tell you exactly who they think you are. You can download and delete the content that Google uses to filter for you. You can turn off filtered results.

Hear that? YOU CAN TURN OFF PERSONALIZED RESULTS.

You can also use incognito mode in Chrome. The unfiltered internet is just a Ctrl-Shift-N away.

People make it seem like technology companies force us to carry around a huge burden of our life’s past decisions, one that hides the truth (or differing opinions) from us. Maybe Facebook is actually like that, but generalizing from it to algorithms as a whole is illogical.

Merkel and others seem to want technology either to go away so life can go back to normal, or else to hand you a precise formula by which a user’s behavior produces a ranking on any given search result. Let’s ignore the fact that, even if the algorithm were the simplest possible reasonable algorithm, a linear regression (which isn’t even a ranking algorithm, but kids learn it in high school statistics), nobody would have a fucking clue how to interpret it, because everyone is scared stupid at the slightest whisper of math. (Yeah, let’s ignore that education is the real problem behind the “endangered debate culture.”)
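
To make that concrete, here is a toy sketch in Python, with made-up page features and weights, of what even the “simplest possible” transparent ranker would look like: a linear score over page features. Publishing the weights hands you a formula, not an explanation of why one page outranks another; what it does hand an SEO consultant is a list of exactly which features to inflate.

```python
# Hypothetical illustration: a "fully transparent" linear ranker.
# The features and weights below are invented for the example.
pages = {
    "page_a": {"keyword_matches": 12, "inbound_links": 340, "load_time_s": 1.2},
    "page_b": {"keyword_matches": 30, "inbound_links": 15, "load_time_s": 0.4},
}

# Stand-ins for what a trained linear model might learn.
weights = {"keyword_matches": 0.8, "inbound_links": 0.05, "load_time_s": -2.0}

def score(features):
    # A linear score: weighted sum of the page's features.
    return sum(weights[name] * value for name, value in features.items())

# Rank pages by descending score.
ranking = sorted(pages, key=lambda p: score(pages[p]), reverse=True)
print(ranking)  # the "transparent" output is a list, not an explanation
```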

Here’s what would happen in a world where Google’s algorithm were public knowledge.

  1. The overwhelming majority would behave no differently (nor would they notice, or care).
  2. SEO consulting would go through the roof, as anyone with enough money to hire someone with math smarts could game their way to the top of Google’s search results. Propaganda would skyrocket.
  3. People actively trying to harm you (with fake designer products, fake news stories) would screw everyone over.

This isn’t a joke; it has happened before. Just look at the textbook case of Vitaly Borker, a man who used public knowledge of the Google ranking algorithm not only to game his way to the top of the search results, but also to defraud, threaten, and stalk his angry customers. Read the longer story here for more.

The truth about algorithms is “Garbage in, garbage out.” And the more public an algorithm is, and the more money your business can make by gaming that algorithm, the more garbage people will throw at it to try to slant it in their favor.

Think of Google’s ranking algorithm like a classified government dossier: it’s kept secret because bad people would harm you if it became public.

That’s not to say we shouldn’t encourage private government audits of the ranking algorithm, or that Google shouldn’t make efforts to make its algorithm more transparent for the sake of those audits. I know too well how useless it would be to stare at a table of neural network weights.

And maybe it would even be nice if Google could provide some sort of guarantee that its results aren’t “biased.” But let’s be honest: the term doesn’t have a precise meaning, nor does it really make sense to enforce in most situations. If Donald Trump grabs a woman’s pussy on live television, it would make zero sense to force Google to return an equal number of stories showing the video evidence as stories claiming it’s a made-up conspiracy. The front page of the news should not be full of stories about Democrats fixing polls via “oversampling” when the authors of those allegations are too stupid to know what oversampling means.

Saying Google’s search results shouldn’t be “biased” is a nice thought, but really we want something else. We want falsified stories to be removed and their authors held accountable (say, by ranking their later stories lower and flagging the results as potentially untrustworthy). We want more reliable sources of information. We want careful deliberation, evidence, and expertise. We want transparency from institutions that have repeatedly lied to the public.
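
For illustration only, here is a minimal sketch of the kind of mechanism that paragraph gestures at: folding a source’s track record into the ranking score and flagging anything below a trust threshold. The reliability values, the default for unknown sources, and the threshold are all invented for the example.

```python
# Hypothetical sketch: penalize stories from sources with a bad track record
# and flag low-trust results. All numbers here are made up for illustration.
source_reliability = {
    "reliable-wire.example": 1.0,
    "serial-hoaxer.example": 0.2,
}
DEFAULT_RELIABILITY = 0.7  # assumed prior for sources with no track record
FLAG_THRESHOLD = 0.5       # below this, mark the result as untrustworthy

def adjusted_score(base_score, domain):
    # Rank later stories from unreliable authors lower by scaling the score.
    return base_score * source_reliability.get(domain, DEFAULT_RELIABILITY)

def needs_flag(domain):
    # Flag results from sources whose reliability falls below the threshold.
    return source_reliability.get(domain, DEFAULT_RELIABILITY) < FLAG_THRESHOLD

print(adjusted_score(10.0, "serial-hoaxer.example"))  # 2.0
print(needs_flag("serial-hoaxer.example"))            # True
```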

The problem is that the minute you do this, everyone whose stories get killed, or whose views don’t show up as often, will complain. The facts don’t matter to them; they just want to feel needed and appreciated and safe in their bubble, and to yell at others for being different. It’s the human way.

But seriously, modern public distrust is due in equal parts to bona fide government corruption, lack of accountability, and the sort of public unintelligence that causes people to leap to conclusions with a certainty that has spawned an entire subfield of research. Sprinkle in a few dashes of “focus-grouping the shit out of a political platform,” and here we are: no issues to discuss, just tribes, empty words, and a bunch of people in Minnesota who are afraid of Sharia law, but really just afraid of being useless.

As I always say when my friends talk about how scary Black Mirror is: it’s not the technology that’s scary; it’s the people that are scary.

