Algorithmic Recommendation System and Memetic Cultural Production

Helena Sandberg
In Search of Search (& its Engines)
7 min read · Apr 1, 2021
Screencap of YouTube search results for the “coffin dance” meme

Dr. Gavin Feller is a postdoctoral research fellow in digital humanities at Umeå University with a background in critical-cultural studies and media theory. Much of his research has focused on the intersection of media, religion, and culture, and his current work explores digital media platform infrastructures with a focus on children and families.

In January 2021 Gavin visited the “In Search of Search” research theme at the Pufendorf Institute for Advanced Studies and presented his newest work on YouTube’s algorithmic recommendation system as a form of memetic cultural production. Shortly thereafter we had the chance to conduct this interview with him.

Gavin is interested in memes: a form of cultural content, often shared on social media, that combines images and text. Memes are highly intertextual, borrowing or imitating elements from other cultural formats. They can be considered a form of modern folklore.

Helena Sandberg (HS): You are doing research on memes. In what way do memes matter in our contemporary culture?

Gavin Feller (GF): Memes are a great entry point into contemporary digital culture because many of us engage with them on a daily basis. They have many layers of cultural meaning, like an inside joke that’s spread all over the world and changed along the way. They also serve a wide range of cultural functions, from pointed political critique to ‘digital nonsense’. Memes can be used to promote social change or just to have fun. For me, memes are also part of the bigger question of what happens when institutions — with their organizational strategies, ideologies, and values — attempt to appropriate popular culture.

Gavin was first drawn to memes through religion. In collaboration with a former colleague, he researched how religious institutions appropriate meme culture and analyzed the sublime and aesthetic sides of memes. This led him to explore memes on YouTube. Across these projects, his main goal is to understand the role technical systems play in the circulation of internet memes.

HS: Could you tell us what your research at the Umeå Humlab is about more specifically?

GF: Well, we often think of memes as having a life of their own, that they circulate organically and spontaneously throughout digital platforms. But, anyone who’s studied media infrastructure knows that social media platforms and other places where memes spread have unique affordances and limitations that impact how different content is created and circulated.

YouTube, as the largest video platform in the world, is a fascinating case study for exploring how video memes spread. The site has bragged that its recommendation algorithm is responsible for 70% of the time users spend on the platform. That’s considerable influence. Many of us know the experience of being sucked down the YouTube ‘rabbit hole.’ And those of us with kids know that there’s no telling where a toddler will end up if you hand them a tablet with YouTube and let the up-next algorithm auto-play videos for an hour!

For now, I’m interested in how this recommendation system understands memetic content. That is, how does it identify, categorize, and then recommend video memes? That’s a big question that calls for an interdisciplinary approach to both the technical platform and the cultural practices of memes. I’m hopeful that what I’m doing is starting to answer that question in some way.

HS: How does the research relate to search engines then? In what ways are recommendation systems similar to or different from search engines?

GF: YouTube’s recommendation system is very similar to a search engine. To flip it around, what search engines do is recommend content based on a user’s input (e.g., keywords, browser history). A recommendation system like YouTube’s does the same thing but it assumes a less active user. In a way, recommendation systems are designed, like so many contemporary technologies, to make our lives simpler and more convenient by sorting through incredible amounts of information and then saying, “here’s what you want!” So, recommendation systems and search engines are both sorting devices and thereby instruments of power. They make judgments about what is relevant and useful to a given person.

Search engines differ from recommendation systems in that the former assume a more active user: not someone sitting on the couch hoping YouTube will automatically play something good, but someone manually entering keywords in search of something more specific. There are many different kinds of search engines and recommendation algorithms operating in a variety of contexts, but they are all designed to work out of sight. That is part of what makes them so powerful. It is also important that as scholars we do our part to hold Goliath tech companies like YouTube/Google accountable for the ethics and social responsibilities of their systems. There’s a lot more work to be done here, and we need real interdisciplinary collaboration to do it.
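To make that comparison concrete, here is a toy sketch in Python — my own illustration, with invented data and scoring rules that have nothing to do with YouTube’s actual system. Both functions rank the same small catalogue; the only thing that changes is whether the input signal is an explicit query or an implicit watch history.

```python
# Toy illustration (not YouTube's system): a search engine and a recommender
# both rank the same catalogue; they differ in which user signal they use.

CATALOGUE = {
    "v1": {"title": "coffin dance meme compilation", "tags": {"meme", "music"}},
    "v2": {"title": "Game of Thrones official trailer", "tags": {"hbo", "trailer"}},
    "v3": {"title": "Simpsons meme steamed hams", "tags": {"meme", "simpsons"}},
}

def search(query, k=2):
    """Explicit signal: rank videos by keyword overlap with the typed query."""
    terms = set(query.lower().split())
    scored = {
        vid: len(terms & set(meta["title"].lower().split()))
        for vid, meta in CATALOGUE.items()
    }
    return sorted(scored, key=scored.get, reverse=True)[:k]

def recommend(watch_history, k=2):
    """Implicit signal: rank unseen videos by tag overlap with what was watched."""
    seen_tags = set().union(*(CATALOGUE[v]["tags"] for v in watch_history))
    scored = {
        vid: len(CATALOGUE[vid]["tags"] & seen_tags)
        for vid in CATALOGUE
        if vid not in watch_history
    }
    return sorted(scored, key=scored.get, reverse=True)[:k]

print(search("coffin dance"))   # the user asks explicitly
print(recommend(["v3"]))        # the system infers from behaviour
```

In both cases the machine sorts the same pile of content and hands back a short ranked list — the “judgment about what is relevant” Gavin describes — which is why he treats them as two faces of the same instrument of power.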

HS: But what did you find out? What have been the most interesting or surprising results so far?

GF: Well, we certainly didn’t crack the recommendation system and reveal all of YouTube’s best-kept proprietary secrets! We did, however, find some emerging patterns in the data that demonstrate that YouTube doesn’t treat all meme videos the same. We found that in many cases the recommendation system treats memes as appendages to popular, and more profitable, source texts.

HS: What does that mean? Could you give an example?

GF: So, if you watched a Game of Thrones video meme the recommendation system would push you to the official Game of Thrones/HBO YouTube channel and other Hollywood content. Of course this makes sense because those bigger channels and corporate partnerships can generate more ad revenue for YouTube. But, at a cultural level, when recommendation follows this pattern, the layers of cultural meaning embedded into a given meme are lost on the system.

It appears that YouTube’s recommendation system can also recognize certain memes not just as appendages to a popular text but as a genre in their own right. For instance, instead of connecting a meme based on the popular animated TV series The Simpsons back to an official Simpsons channel, YouTube recommends more Simpsons memes.

GF: We’re still sorting out why these two different recommendation patterns exist and what they mean for our understanding of meme culture.

One thing is clear: YouTube’s recommendation algorithm is a system that affords visibility to some videos and denies it to others, making some memes easier to access and more likely to spread. Gavin hopes that this will push the study of meme culture away from what has been an almost singular focus on memes as texts, and encourage scholars to dig deeper into the technical infrastructures of other platforms where memes often originate, such as Reddit and 4chan.

As a media scholar and an outsider to computer science, I was quite curious to learn more about the humanities lab at Umeå University and to understand more of the work behind this kind of research.

HS: If possible, could you describe in more detail how the research is carried out?

GF: These are so-called ‘black box’ technologies: the guts of the machines are carefully kept hidden as proprietary intellectual property. Some scholars try to ‘reverse engineer’ an algorithm or do an ‘algorithm audit.’ Our approach is less an attempt to figure out exactly how YouTube’s recommendation system works at a technical level and more an effort to begin experimenting with ways we can see the cultural implications of its logics. That said, we’ve started with a simple robot programmed to play through the up-next videos and scrape metadata about the recommended videos along the way. We then use a combination of thematic and network analyses to get a sense of what is being recommended, at both a quantitative and a qualitative level.
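For readers curious what such a robot might look like, here is a minimal, purely illustrative Python sketch using Selenium. It is not the Humlab team’s code: the CSS selector is an assumption about YouTube’s markup at the time of writing (it changes often), the seed video ID is a placeholder, and scraping may conflict with YouTube’s terms of service.

```python
# Illustrative sketch of an 'up-next' crawler, NOT the Humlab team's code.
# The CSS selector below is an assumption about YouTube's markup and will
# likely break as the site changes; check YouTube's Terms of Service before
# running anything like this.
import csv
import time
from selenium import webdriver
from selenium.webdriver.common.by import By

START_URL = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder seed meme video
HOPS = 10  # how many 'up next' steps to follow

driver = webdriver.Chrome()
rows, url = [], START_URL
for hop in range(HOPS):
    driver.get(url)
    time.sleep(5)  # crude wait for the page and sidebar to render
    title = driver.title
    # Assumed selector for the sidebar recommendations:
    recs = driver.find_elements(By.CSS_SELECTOR, "ytd-compact-video-renderer a#thumbnail")
    rec_urls = [r.get_attribute("href") for r in recs if r.get_attribute("href")]
    rows.append({"hop": hop, "url": url, "title": title, "recommended": ";".join(rec_urls)})
    if not rec_urls:
        break
    url = rec_urls[0]  # follow the top recommendation, mimicking autoplay
driver.quit()

with open("upnext_crawl.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["hop", "url", "title", "recommended"])
    writer.writeheader()
    writer.writerows(rows)
```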

So, for instance, if you play a popular version of the ‘Coffin Dance’ meme, what kinds of channels does YouTube recommend to you? Are they channels that specialize in memes, or something else? Are the other recommended videos also memes? We’re using a new twist on network analysis, visualizing the video data collected by the robot in a way that helps us trace the ‘flow’ of the algorithm and see how video recommendation works over time.
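That ‘flow’ can be sketched as a directed graph. The snippet below — again my own illustration rather than the project’s code, and assuming a crawl file like the one produced above — uses the networkx library to build a graph where an edge means “this video recommended that video,” and then surfaces the videos the algorithm keeps steering viewers towards.

```python
# Sketch: turn crawl output into a directed recommendation graph.
# Illustrative only; assumes the CSV written by the crawler sketch above.
import csv
import networkx as nx

G = nx.DiGraph()
with open("upnext_crawl.csv", newline="") as f:
    for row in csv.DictReader(f):
        source = row["url"]
        for target in filter(None, row["recommended"].split(";")):
            # Edge weight counts how often the same recommendation recurs.
            if G.has_edge(source, target):
                G[source][target]["weight"] += 1
            else:
                G.add_edge(source, target, weight=1)

# Which videos does the system keep funnelling viewers towards?
ranked = sorted(nx.pagerank(G, weight="weight").items(), key=lambda kv: kv[1], reverse=True)
for url, score in ranked[:10]:
    print(f"{score:.4f}  {url}")

# Export for visual inspection of the recommendation 'flow', e.g. in Gephi.
nx.write_gexf(G, "recommendation_flow.gexf")
```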

Gavin says that there isn’t one ‘winning formula’ for studying these kinds of systems. There is still a lot to learn, and much of his work so far is experimental, just trying to figure out what it means to study how YouTube’s recommendation system treats video memes. What Gavin and his colleagues at Humlab hope to do is bring more attention to the technical infrastructures undergirding the digital spaces where memes thrive.

HS: So, how will you take this research forward, and what will be the next step? And what, in your opinion, are the most important challenges and questions to ask about search engines and recommendation systems?

GF: Moving forward, we need to continue experimenting with how to study YouTube’s recommendation algorithm. There’s much, much more to be done. Although tempting, the answer isn’t simply “more data,” even if our initial study is quite small in that sense and could certainly benefit from a larger data set.

It’s not just a quantitative problem; we need better collaboration between the computer science folks who build these systems and cultural researchers. This kind of research has to combine theoretical and cultural insights with technical knowledge in order to make persuasive and useful claims. We’ve got to take into account the agency of both YouTube’s recommendation system and its users, as well as the larger political economy and cultural discourses in which each is situated.

So one major step forward is overcoming the logistical and institutional challenge of getting the right teams of scholars together to do this well. I also think it’s important that researchers put platforms like YouTube into a larger media history. Many of us watched the recent documentary ‘The Social Dilemma’ in horror as former tech bros talked about today’s social media platforms as unprecedented devices completely detached from previous media technologies, cultural discourses, and political economies.

Some history goes a long way when it comes to moral panics about search engines and recommendation systems. On the other hand, we need pressure on major tech companies in order to keep them accountable and a bunch of academics saying, ‘slow down, it’s complicated’ isn’t always conducive to change. Again, if we can get the historical and theoretical knowledge in the same room with the technical expertise and political activism, that’ll make a big difference.


Helena Sandberg
In Search of Search (& its Engines)

Media and Communication Scholar from Sweden interested in digital media technologies, practices, and cultures