Harassed by Algorithms

Joanne McNeil
Published in The Message
Dec 31, 2014 · 4 min read


A user recommended as “similar to you” might be dangerous.

Several years ago, long before Spotify, I regularly listened to music through Last.fm. I would visit the website frequently to learn about upcoming concerts and the “related artists” of musicians I liked. The site introduced a feature that let you find people with similar taste in music, with a score indicating how closely your tastes matched. Their avatars would appear in a sidebar. The people recommended to me remained consistent. I wrote to a few of them, although I never met anyone in person through that website. One recommended user in particular had a remarkable number of favorite artists in common with me.

Her avatar image was an obscure art object, not her face, and she used a handle instead of her name, but her location and circle of friends all suggested it might be someone I had heard about but never met. I am fairly certain it was the ex-girlfriend of the guy I was involved with at the time.

Right. Of course I’d discover her this way. He made us the same mix tapes.

Her avatar popped up every time I logged into the website.

These sorts of accidental algorithmic run-ins now happen more frequently, often with startling insensitivity and with greater potential for emotional distress.

A number of people have observed how Facebook’s “People You May Know” list can seem like a lineup of people you indeed know, but would prefer not to know, whether they are former coworkers, estranged family members, or acquaintances you otherwise try to avoid at parties. Noah Michelson, writing for the Huffington Post, noticed that his “People You May Know” list was largely populated with men he had met through Grindr. Responding to Michelson’s piece, Grindr denied it shares user data with Facebook, but it is hard to track what is shared where and how. A sea of intermediaries might be the pathway to an eventual phone number match that Facebook then uses to display a person’s profile as someone you “may know.” It is not hard to picture circumstances that might make someone panic. What if that one meeting with someone off of Tinder or Grindr was not a hook-up but a sexual assault?
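
Facebook has never documented this pipeline, but one widely used industry pattern, assumed here purely for illustration, is contact matching: apps upload hashed, normalized phone numbers, and any two accounts whose uploads intersect become candidates for one another’s “People You May Know.” A minimal sketch, with every name and number hypothetical:

```python
# A hypothetical sketch of contact matching. Facebook has not documented
# its actual pipeline; hashing normalized phone numbers and intersecting
# them is a common industry pattern, assumed here for illustration.
import hashlib

def normalize(phone: str) -> str:
    """Strip formatting so "+1 (555) 010-0199" and "15550100199" compare equal."""
    return "".join(ch for ch in phone if ch.isdigit())

def hashed_contacts(phones: list[str]) -> set[str]:
    """Hash normalized numbers, the form in which address books are often uploaded."""
    return {hashlib.sha256(normalize(p).encode()).hexdigest() for p in phones}

# If any intermediary app uploads both address books, the overlap alone is
# enough to link two accounts, wherever the numbers were first shared.
mine = hashed_contacts(["+1 (555) 010-0199", "+1 555-010-0123"])
theirs = hashed_contacts(["15550100199"])
print(mine & theirs)  # one shared hash: a "People You May Know" candidate
```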

I am particularly troubled by the way that Twitter recommends people to follow after you follow someone, because it is blind to the user herself. I have a number of inactive accounts on Twitter: ideas for bots I never got around to building, or other projects I never took beyond reserving a domain and a Twitter handle (and then following my main profile but no one else). Every so often I get an email that looks like this:

[Image: one of Twitter’s follow-recommendation emails]

“The list of people to follow after you follow someone” sounds like the beginning of a contemporary update to a “my friend’s cousin’s wife’s grandfather’s best friend” joke. There is no easy way to describe what is happening, but it has real consequences.

Twitter also suggests “Who to follow” on a person’s profile page. Clicking on an acquaintance’s page in the Twitter app, I was shocked to see that the first person listed among the three profiles was someone who had sexually harassed her at her former workplace. She has blocked his profile. Twitter’s “Who to follow” overrides her own preferences. She can’t see that he is being suggested, advertised even, on her own page. This data is not for her; it is for everyone else. Somehow Twitter has calculated an affinity score based on factors like how many people follow both of them.

If you are stalked by a former coworker, Twitter may reinforce this connection, algorithmically boxing you into a past you are trying to move on from. Your affinity score with your harasser will climb with every person who follows him at Twitter’s recommendation. And you won’t even know, unless you, like me, have a few dummy Twitter accounts and haven’t unsubscribed from all emails.
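
Twitter has not published how that score is computed; a Jaccard similarity over follower sets is one plausible stand-in, assumed here only to show the feedback loop. The function and accounts below are hypothetical:

```python
# A hypothetical co-follower affinity score. Jaccard similarity is an
# assumption for illustration, not Twitter's actual formula.

def affinity(followers_a: set[str], followers_b: set[str]) -> float:
    """Jaccard similarity: the share of all followers the two accounts share."""
    union = followers_a | followers_b
    return len(followers_a & followers_b) / len(union) if union else 0.0

# The feedback loop: each user who follows both accounts, perhaps at the
# recommendation's own prompting, raises the score behind the recommendation.
you = {"ana", "ben", "cam"}
harasser = {"ben", "cam", "dee"}
print(affinity(you, harasser))  # 0.5

you.add("eli"); harasser.add("eli")  # one more shared follower
print(affinity(you, harasser))  # 0.6, and the suggestion grows stronger
```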

A friend noticed that the first person to show up in her list of people to follow is usually a cisgender white man who gets more attention for doing the same work she does. Like the problem of Google’s algorithms defining “beauty” as whiteness, the product of layers and years of discrimination, there is no way to amplify marginalized voices if structural inequality is reflected in our algorithms and reinforced in user pageviews.

Twitter is suggesting people to listen to, not people who are your friends or might have similar taste in music. There is professional interest in accumulating a large audience; Twitter followers are valuable for reasons beyond the “social graph.” As the emails I received explain, “Following the ones you like will help you stay informed on what matters the most to you today and discover what might matter to you tomorrow.”

The method of classifying people as “similar” reduces us to objects that might be recommended, through data, like a book or a kitchen item. We are more than our avatars, but careless prompts like these create scenarios that are harder to control and prepare for, given how complexly our personal and professional lives intertwine.

At the very least, Twitter users should have some control over who is suggested. Why not let users themselves select the people they recommend you follow after following them? When I discussed this with my friend Melissa Gira Grant, she pointed out that we already #FF (Follow Friday). If Twitter is going to recommend people, why not take their own word for it? This could result in a more diverse and more accurate list anyway.
