Dear Facebook: Just let third-party developers tweak the newsfeed

I keep reading about Facebook getting blamed for fake news trending in people’s newsfeeds, especially after it fired its human curators.

My solution is simple: just open up the newsfeed algorithm to third-party developers. I don’t mean “show everyone what the algorithm is”; that would be disastrous, since every publisher would try to game the system even more than they already do. Instead, let third-party developers take the raw newsfeed and tweak it.

Think of it like an app market. You could activate or install a ‘friends only’ algorithm app, which only shows you pictures your friends are tagged in. You could also activate a ‘trusted news’ app, which only shows you news from sites on your whitelist, or from a community-curated list of reliable news sites. You could mix the two apps 50/50, 60/40, or 90/10 to get the newsfeed you want out of it, as in the sketch below.
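
To make the mixing idea concrete, here is a rough sketch of what a third-party feed-algorithm API could look like. Everything in it (the FeedItem shape, the FeedApp type, friendsOnly, trustedNews, blendFeeds) is invented for illustration; Facebook offers no such API.

```typescript
// Hypothetical sketch of a third-party feed-algorithm API.
// Every name here is invented; nothing like this exists in Facebook's platform.

interface FeedItem {
  id: string;
  kind: "photo" | "news" | "status";
  sourceDomain?: string;    // publisher domain for news links
  taggedFriends: string[];  // friends tagged in the item
}

// A feed app takes the raw, unranked feed and returns the items it wants, in order.
type FeedApp = (rawFeed: FeedItem[]) => FeedItem[];

// "Friends only": keep items that have at least one tagged friend.
const friendsOnly: FeedApp = (raw) =>
  raw.filter((item) => item.taggedFriends.length > 0);

// "Trusted news": keep news items whose publisher is on the user's whitelist.
const trustedNews = (whitelist: Set<string>): FeedApp => (raw) =>
  raw.filter((item) => item.kind === "news" && whitelist.has(item.sourceDomain ?? ""));

// Mix two ranked feeds by weight, e.g. weightA = 0.6 for a 60/40 blend.
// A simple weighted interleave; duplicates between the two lists are not handled here.
function blendFeeds(a: FeedItem[], b: FeedItem[], weightA: number): FeedItem[] {
  const mixed: FeedItem[] = [];
  let ia = 0, ib = 0, credit = 0;
  while (ia < a.length || ib < b.length) {
    credit += weightA;
    if ((credit >= 1 && ia < a.length) || ib >= b.length) {
      mixed.push(a[ia++]);
      credit -= 1;
    } else {
      mixed.push(b[ib++]);
    }
  }
  return mixed;
}

// Example: a 60/40 mix of "friends only" and "trusted news".
// const myFeed = blendFeeds(friendsOnly(rawFeed), trustedNews(whitelist)(rawFeed), 0.6);
```

The blending step is just a weighted interleave, so a 90/10 mix simply pulls roughly nine items from one app for every one from the other.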

Deciding our own moods

There’s this idea that we create echo chambers, especially on Facebook, where we’re only exposed to ideas we agree with. The reality is more pernicious: we’re only exposed to ideas we interact with.

Clicking or commenting on a news story isn’t the same as agreeing with it, endorsing it, or wanting to see more of it. In fact, since anger spreads best on social media, the reason the algorithm is delivering that content to us in the first place could be specifically to anger us enough that we share or click. The algorithm cares about what we do, not how it makes us feel.

But we care. Long-term exposure to algorithms optimized for anger has a noticeable effect: we’ve become more upset and more negative.

The business case for positive emotion

Sure, emotions such as joy don’t spread as quickly, but they produce more long-term satisfaction. Have you ever heard of someone quitting Facebook because they were overwhelmed by all the happiness in the world?

I’m not saying we should censor all the negative news, but let users make an informed decision about what they’re exposed to and what it does to their long-term mood. People won’t get frustrated and leave for other networks if they can control how your network makes them feel.

Censorship

In addition, as many of us know, Facebook has a set of community standards. These exist to keep a certain feel to the site, but they have the unfortunate effect of being arbitrary, culturally defined, and almost impossible to change when they misfire (unless you create a social media storm).

With third-party development, you could choose the level of censorship you receive in your feed, much like Google’s filtered vs. unfiltered search results, taking Facebook off the hook and leaving it less exposed to legal action.
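
As a sketch of what that user-chosen level could look like, assuming feed items carry some kind of moderation tag: FilterLevel, FlaggedItem, and moderationApp are invented names here, not anything Facebook actually exposes.

```typescript
// Hypothetical sketch of a user-chosen moderation level, enforced by a
// third-party filter app rather than by Facebook. All names are invented.

type FilterLevel = "strict" | "moderate" | "off";

interface FlaggedItem {
  id: string;
  flaggedAs?: "graphic" | "nudity" | "harassment"; // tag set by classifiers or user reports
}

// Returns a filter that applies the level the *user* picked.
const moderationApp = (level: FilterLevel) => (raw: FlaggedItem[]): FlaggedItem[] => {
  if (level === "off") return raw;                                  // unfiltered feed
  if (level === "moderate") return raw.filter((i) => i.flaggedAs !== "graphic");
  return raw.filter((i) => i.flaggedAs === undefined);              // strict: hide anything flagged
};

// Example: the user opts for moderate filtering.
// const visible = moderationApp("moderate")(rawFlaggedFeed);
```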

In Conclusion

There are plenty of benefits, and the algorithm apps could be incentivized based on usage or ratings, or simply offered as alternatives for people to try out. What do you say, Zuck?