Nature and/of the News Feed
Algorithms can never be ‘natural’ — but they might be more or less representative.
By now, you probably already know about Facebook’s experimentation with emotional contagion. Briefly: a team of researchers, within and without the company, used sentiment analysis to assign positive or negative emotional ‘scores’ to status updates. They then created two cohorts and reduced the number of either positive or negative posts appearing in each cohort’s News Feed. After a week of ‘treatment,’ those who had seen more positive posts tended to post more positive updates; conversely, those who had been exposed to more negative posts tended to express more negativity themselves.
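The mechanics are easy to sketch. Here is a toy version of the pipeline, assuming a crude word-list scorer in place of the LIWC-style tool the study actually used; the function names and lexicons are mine, not Facebook’s:

```python
import random

# Toy sentiment lexicons -- stand-ins for a real tool like LIWC.
POSITIVE = {"love", "great", "happy", "wonderful"}
NEGATIVE = {"hate", "awful", "sad", "terrible"}

def sentiment(post):
    """Score a post as 'positive', 'negative', or 'neutral' by word counts."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def treated_feed(posts, suppress, rate, rng=random.random):
    """Build a cohort's feed: omit each post of the suppressed
    sentiment with probability `rate`, pass everything else through."""
    return [p for p in posts
            if sentiment(p) != suppress or rng() >= rate]
```

One cohort gets `treated_feed(posts, "positive", rate)`, the other `treated_feed(posts, "negative", rate)`; the rest of the study is just measuring what each cohort posts afterward.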
Lots of commentators are arguing about whether this was an ethically and/or scientifically good study. I want to talk about something else: how the News Feed works, and how it ought to.
“The experiment manipulated the extent to which people…were exposed to emotional expressions in their News Feed.”
The primary function of the News Feed is to algorithmically divert “top” posts by your friends from the firehose of Facebook activity to a central stream, surfacing some updates while sinking others beneath the waterline of attention. Through this process it makes a kind of map which registers, identifies, and points to social landmarks, things-which-exist among your friends. Like all maps, the News Feed is only a partial picture, an incomplete account, for a map as large as the territory would be just as unnavigable.
For this study, Facebook slightly altered the maps of some users, adjusting their list of landmarks to make the depicted territory seem a more or less depressing place, depending. Brian Keegan argues that this is actually standard practice, an academic implementation of the now-ubiquitous design method of A/B testing:
As users, we expect these systems to be responsive, efficient, and useful…These user experiences require diverse and iterative methods, which include A/B testing to compare users’ preferences for one design over another based on how they behave. These tests are pervasive, active, and on-going across every conceivable online and offline environment from couponing to product recommendations. Creating experiences that are “pleasing”, “intuitive”, “exciting”, “overwhelming”, or “surprising” reflects the fundamentally psychological nature of this work: every A/B test is a psych experiment.
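Keegan’s point, that cohort assignment is routine infrastructure rather than exotic science, shows in how such tests are typically wired up. A common pattern (this sketch is generic, not Facebook’s actual system) is to hash a user id with an experiment-specific salt, so assignment is stable across sessions without storing any per-user state:

```python
import hashlib

def assign_bucket(user_id, experiment, variants=("control", "treatment")):
    """Deterministically assign a user to an experiment variant.
    Hashing the id with the experiment name as salt keeps the
    assignment stable for a user, but independent across experiments."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because the salt differs per experiment, the same user can land in different arms of different tests, which is what lets dozens of them run concurrently over the same population.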
I’ve heard the same argument from a number of designers and engineers, and it’s not inaccurate. Facebook, like many of its peers, can be characterized as what STS scholars call a disunity: there is not a single Facebook which everyone experiences uniformly, but rather many simultaneous Facebooks, algorithmically constructed on-the-fly for each person. From this perspective, the Facebook News Feed is always already manipulated: not only incomplete, but indeed intentionally so, engineered to influence users toward behaving in a particular way. As danah boyd writes:
Facebook actively alters the content you see. Most people focus on the practice of marketing, but most of what Facebook’s algorithms do involve curating content to provide you with what they think you want to see…They don’t do this for marketing reasons. They do this because they want you to want to come back to the site day after day.
The News Feed is neither natural nor neutral. But the natural/artificial distinction is less relevant than it might initially appear. Even if the Facebook News Feed is always constructed, we might demand that it be constructed differently. It may always be a fiction, but it could have a comparatively higher fidelity to fact.
The core anxiety about this study, as I see it, is not a tension between natural state and artificial manipulation but a question of representation. What most users want, I suspect, is assurance that when Facebook compresses the world for them it does so with some measure of impartial accuracy across various emotional, political, and other dimensions. Any given person’s News Feed might appropriately be more or less happy, or more or less Republican, depending on the composition of their social network. Facebook’s job (I claim) is not to be natural, nor neutral, but to be representative, reflecting the non-natural, non-neutral lifeworld of the user as accurately as possible. The anxiety arises when Facebook intervenes by shifting the dimensional distribution according to its own interests, research or otherwise.
Facebook sells itself to advertisers in part on the strength of its sentiment analysis and named-entity recognition. If Facebook, as it claims, has the tools to assess whether status updates are happy or sad, it also has the tools to make sure they are circulated in fair proportion. We might demand, potentially through law, that Facebook deploy these tools not solely in the service of its marketing department but also in the service of its users, crafting as emotionally and politically representative a News Feed as possible.
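What “circulated in fair proportion” might mean concretely: when the feed must compress a large candidate pool down to a handful of posts, sample so that the selection preserves the sentiment mix of the full pool. A minimal sketch, assuming posts arrive already labeled (the rounding of quotas is a simplification):

```python
import random
from collections import defaultdict

def representative_sample(posts, k, label, rng=random):
    """Pick ~k posts whose label proportions mirror the full pool's.
    `label` maps a post to its sentiment tag; within each tag the
    choice is random, so only the overall mix is constrained."""
    groups = defaultdict(list)
    for p in posts:
        groups[label(p)].append(p)
    picked = []
    for tag, group in groups.items():
        quota = round(k * len(group) / len(posts))  # proportional quota
        picked.extend(rng.sample(group, min(quota, len(group))))
    return picked
```

If 60% of a user’s friends’ posts are happy that week, roughly 60% of the compressed feed is too; the map stays partial, but its distortions along this axis are bounded.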
The News Feed will always be an imperfect reflection of one’s social world, but reducing Facebook’s freedom to distort that reflection reduces its institutional influence commensurately. We can — and I think should — demand that if the News Feed is meant to function as a mirror, Facebook shouldn’t be able to cavalierly reshape it into one that belongs in a funhouse.