The Worst Art And Media Out There Seems To Get The Most Attention; This Is Why

Peter Coffin
8 min read · Oct 10, 2015


Algorithms That Sort Art And Media: How They Do It

They’re everywhere. Strings of math creating logic that decides which videos, articles, stories, photos, and status updates you would be most interested in seeing. Computers are predicting what decisions you would make if you were presented with the absolute overload of “content” published to the internet every single day — and not even the final decisions, but the preliminary ones about what to present for you to decide between. You probably know that most art and media sorting isn’t manual, and you might have a hunch that increasingly powerful computers are used to do it. “Is it The Matrix?” this writer asked, as if he’d given in to the hopeless inevitability that, this being the internet, he had to.

What, truly, makes “content” do well? Today’s rallying cry is “engagement!” It’s true, too — but it brings us to a quasi-chicken-and-egg dilemma. Common wisdom (almost never something to look for if you wish to find information that could be considered “wise”) is that if something has been engaged with, people are more likely to share it because they have tied themselves to it in some way, negatively or positively. This now isn’t just an artist or creator expressing themselves; it’s also the consumer expressing themselves, and therefore an expression of their identity. This is a contributing factor to the success of content: people have traditionally been inclined to contextualize it with their own opinion to state who they are as a person relative to the creator of the original work. Social sharing is based almost entirely in the “here’s a thing, here’s what I think about it” mindset. This is how art and media have been spread.

So why not make systems that can do it for us? Hence, the Engagement-Focused Content Algorithm™ was born. This has evolved into what seems to be our primary method of sorting art and media, with the intention of making our consumption decisions easier. It is, at the very least, easy. Easy, however, isn’t always the best thing to be.

Like I said, the most-engaged content is the most likely to be shared, right? That means the algorithms people want to build are going to be based on predicting what will get shared the most via engagement. Whatever seems most shareable then moves into “related” and “suggested” sections, sorted by metadata (keywords, description, etc.) to find the “most relevant” material for you to consume. This is why you see an unending barrage of anime music videos after your cousin sends you the one they made and you clicked like on it because you’re nice.
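To make that concrete, here’s a minimal sketch of the pipeline in Python. Every field name, the flat keyword match, and the idea that all interactions count equally are my illustrative assumptions, not any real platform’s formula:

```python
# Hypothetical sketch of an engagement-driven "suggested" feed.
# Field names and the crude keyword match are assumptions for
# illustration only, not any platform's actual logic.

def engagement(video):
    """Every interaction counts, regardless of sentiment."""
    return (video["likes"] + video["dislikes"]
            + video["comments"] + video["shares"])

def relevance(a, b):
    """Crude metadata similarity: how many keywords two videos share."""
    return len(set(a["keywords"]) & set(b["keywords"]))

def suggested(just_watched, catalog, n=5):
    """Filter the catalog by metadata, then rank what's left by engagement."""
    related = [v for v in catalog if relevance(just_watched, v) > 0]
    related.sort(key=engagement, reverse=True)
    return related[:n]
```

Click like on your cousin’s anime music video and everything sharing its keywords starts winning those “suggested” slots, whether you wanted it or not.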

Let’s say a video about warts goes viral (“viral” and “content” are words that annoy me, but they’re the “right” ones to use). Another video creator, one with a lot of engagement, makes a video casually mentioning it somewhere between the other things they want to talk about. They orient their metadata to be similar to the viral video’s and include a call to action about commenting and liking. Their viewers, who number considerably fewer than a legitimately viral video’s, engage at a higher percentage than other videos with similar metadata, so the video site puts it at the top of the “suggested” videos. The video gets a lot of views and, algorithmically, that creator is now second in line for all that sweet traffic.
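The numbers below are made up, but they show the mechanism: if the ranker keys on engagement rate rather than raw totals (an assumption on my part, not a documented formula), a small, called-to-action audience beats a big, passive one:

```python
# Illustrative numbers only: why a smaller channel's video can win the
# "suggested" slot when the ranking keys on engagement *rate*.

def engagement_rate(video):
    interactions = video["likes"] + video["dislikes"] + video["comments"]
    return interactions / max(video["views"], 1)

viral_wart_video = {"views": 2_000_000, "likes": 40_000,
                    "dislikes": 5_000, "comments": 8_000}
copycat_video = {"views": 50_000, "likes": 3_000,
                 "dislikes": 2_500, "comments": 4_000}

print(engagement_rate(viral_wart_video))  # 0.0265
print(engagement_rate(copycat_video))     # 0.19 -- outranks the original
```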

Negativity And Abuse Are Built For Engagement

As people flow in from the viral wart video to the unrelated-but-the-system-thinks-it’s-related video, they notice that what they care about is only casually mentioned (or don’t even notice it has been mentioned) and feel annoyed — even betrayed. The user clicks dislike, thinking they’ve done their good deed for the day.

Here’s what’s going to bug the shit out of you, the kind, rational user: dislikes are also engagement. When you click dislike on someone’s work, sites think “this person feels strongly enough about this to take action and may share the video with someone else!” Don’t pretend you’ve never sent someone a video and said “oh my god, I hate this SO much! It’s SO wrong!” Companies are aware you do this — and you get served advertising whether or not you enjoy the video. Your outrage over the video directly contributes to the success of it.
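Sketched as code, the problem is that a simple interaction sum (again, a hypothetical score, not any site’s actual one) is blind to sentiment:

```python
# Hypothetical score to make the point: a sum of interactions cannot
# tell outrage from enthusiasm. Flipping every like to a dislike
# leaves the score, and the video's algorithmic fortunes, unchanged.

def score(likes, dislikes, comments, shares):
    return likes + dislikes + comments + shares

loved = score(likes=1_000, dislikes=0, comments=200, shares=50)
hated = score(likes=0, dislikes=1_000, comments=200, shares=50)

print(loved, hated)  # 1250 1250 -- identical
```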

People are also more likely to click dislike than like. Unless we really love a video, we aren’t likely to click like. But we dole out dislikes like it’s going out of style. We desperately need to reverse this behavior. If you like an artist or creator, even a little, you should be clicking like on their work. If you dislike an artist or creator, you should not be doing anything, because if you do something you are directly contributing to their success. I’m not saying this to protect anyone’s ego, either. I’m saying it because, algorithmically speaking, this is what you are doing.

If you don’t want someone to succeed, do not engage. But if you do? Click on Like, Subscribe, Follow, Recommend, Tweet, Blog, Link, and leave a *nice* comment. Algorithms are waiting for you to tell them what you want put out there more. Negative or hateful creations are usually put out to an echo chamber of support, desperate to be correct, and this is how that kind of material succeeds in both reach and monetization.

On the other side of that coin, can you think of art or media that gets you more riled up, one way or the other, than abuse or harassment? When someone makes a video (I’m most familiar with YouTube as a platform) that directs their followers, either explicitly or discreetly, to harass or abuse a person, those followers engage en masse with the abusive creator’s work. The following of an abusive creator is typically very active and loud (and also abusive) due to a persecution complex of some sort (more often than not they are men who naively feel their masculinity is threatened over some sentence someone else said). They also mass-engage with the target of the abuse (dogpiling, to those in the know), which algorithmically sounds pretty good for the target, as it will help their content, right? Except people are not content. To put it as mildly as I can, most folks do not like it when thousands of people threaten their safety or tell them to commit suicide (among other heinous statements) based on their race, their religion, their gender, or even their innocuous decisions or opinions.

You and I, in contrast, would never agree to anything like that. We find it disgusting that a creator would directly incite this kind of thing — and that much more disgusting when they discreetly imply this is what people should do, knowing that will make it so. So we click dislike, once again thinking we’ve done our good deed for the day — but we’re contributing to the engagement on the abusive video.

This isn’t a new problem, either. See this post from August 26th, 2009 by Reddit’s then (and current) CEO:

A couple of weeks ago /r/moviecritic popped into the top ten reddits, causing quite a stir. The reddit isn’t used for new and interesting links, but rather for links to movies: sometimes old and sometimes new. Users were upset that moviecritic was taking up front-page space and started attacking the reddit by downvoting everything in sight. Users of /r/atheism had been under attacks like this for weeks. Unfortunately, attacking a reddit generates a lot of activity on that reddit and makes our algorithm think the reddit is more popular than it really is, making the problem even worse. (source)

They were calling it “activity” back then, and although vote brigading can be harassment or abuse, this particular case probably isn’t one; it still demonstrates what I am talking about. Negative response caused the algorithm’s internal logic to say “well, this stuff is more likely to get people’s attention, so it should be more visible.”
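Here’s the failure mode from that quote, sketched with made-up numbers: if “activity” counts votes regardless of direction, a downvote brigade makes a subreddit look more popular, not less:

```python
# Illustrative sketch of an "activity" signal that is blind to vote
# direction. The numbers and the formula are mine, not Reddit's.

def activity(upvotes, downvotes, comments):
    return upvotes + downvotes + comments

normal_day = activity(upvotes=900, downvotes=100, comments=300)
brigaded = activity(upvotes=900, downvotes=4_000, comments=1_200)

print(normal_day, brigaded)  # 1300 6100 -- the attack boosts visibility
```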

The “Quasi-Chicken-And-Egg” Situation I Mentioned

So how exactly is “content” getting spread around on the internet? Is it stuff we would have shared and talked about anyway, or is it stuff the system thinks we want to talk about because people who were already interested in it expressed that interest? Or is it because the work itself is inherently negative and we all feel so good expressing our outrage about it? Or is the engagement around a creation itself abusive, with users on both ends of it feeding the engagement-focused algorithm’s saturation?

There’s lots of chickens and eggs at play here.

To use YouTube as an example again, do you remember when their algorithms depended on views rather than engagement? Here’s a hint: you do. Back when YouTube was known as “the cat video site,” getting a lot of views was what caused their algorithms to recommend you cat videos. But the meowing stopped. Views can be faked, and a system based on them is easy to game. YouTube wanted logic that would predict which videos would get the most views, to ensure the videos it would make the most money from were more prominently displayed on the site.
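A toy example of why: if the ranker only sees a view counter, anything that fetches the page can inflate it. This is a caricature, not YouTube’s actual counting (which tried to filter this stuff), but it shows the incentive:

```python
# Why ranking on raw views was easy to game (toy example): the ranker
# only sees a counter, and anything that loads the page increments it.

view_counts = {"cat_video": 10_000, "spam_video": 500}

for _ in range(100_000):          # a bot loop "watching" its own upload
    view_counts["spam_video"] += 1

ranked = sorted(view_counts, key=view_counts.get, reverse=True)
print(ranked)  # ['spam_video', 'cat_video'] -- the gamed video wins
```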

I completely get it. It’s hardly a mystery that YouTube, as a business, not only wants to make money but needs to. That’s not the problem. What needs solving is that negative engagement shouldn’t have a positive impact. Or any impact. Because negative content and negative engagement have such a massive impact on these algorithms (they aren’t the only factor, but they’re too much of one) relative to the stuff people actually like, we’re getting closer and closer to a situation where the worst, most vile material gets the most attention.

Here’s the thing: Aside from a staunch belief that having a dislike button is ridiculous due to the disparity between what people believe it to be and what it actually is, I’m not going to sit here and say I have a better idea (YouTube may have come up with something). Some of my contemporaries also think comment sections shouldn’t exist, and I understand why, but I think that should be up to individual creators. But like I said, I don’t have a business plan that will revolutionize art and media and I’m not going to foist some ridiculous, forced thought on you to cleanly wrap up this essay. I will say this, though: right now, this is how everything works and it’s excessively, immeasurably important to the artists and creators that make everything you enjoy to click that damn like button.

my creations are made possible by your contributions to patreon.com/petercoffin — thank you!


Peter Coffin

video essayist (Very Important Documentaries), author (Custom Reality and You), and podcaster (PACD)