So Facebook is a media company after all….

Leighton Andrews
Jan 12, 2017


On Wednesday, Facebook announced its journalism project. Coming on the back of its pre-Christmas announcement that it was heading into the market for original and licensed video content, this suggests the days of Facebook resisting categorisation as a media company are numbered. These moves certainly call the old ‘we’re a tech company’ defence into question. Indeed, even Mark Zuckerberg appears to have accepted this; in a pre-Christmas interview he said:

Facebook is a new kind of platform. It’s not a traditional technology company. It’s not a traditional media company. You know, we build technology and we feel responsible for how it’s used.

The recent investigations of Facebook and Google by Carole Cadwalladr for The Observer, following the exposure of fake news sites on Facebook by Buzzfeed and others, have brought the mediating role of these technological giants to the surface. They hold a dominant position in online advertising, as Emily Bell pointed out last year, and more recent analysis suggests that this dominance is growing. Advertisers have become increasingly vocal about Facebook’s repeated revisions of its advertising metrics. Facebook and Google are also, of course, the main online news sources for many people, as the Pew Research Center has consistently found.

Recently, it has been argued that media companies do five things: create, post, curate, distribute and monetise content; and that because Facebook doesn’t create, it isn’t a media company. On that score, Channel 4, under its original publisher-broadcaster model, wouldn’t have been a media company either.

In fact, Facebook has been in the content business for a while. In December, Facebook offered me a cleverly edited video of my activity on its site during 2016, complete with original graphics it had created. How is that not making media? Facebook itself, or the algorithm that powers its newsfeed, has made a series of what can only be described as ‘editorial’ decisions: censoring the famous historical photograph of a naked young girl fleeing a napalm attack; censoring a statue of Neptune in Bologna; and blocking a breast cancer awareness campaign. Meanwhile, Facebook removes material at the request of governments around the world (to be fair, it reports on this) and has allegedly devised a censorship tool to accommodate Chinese government demands. It undertakes what can only be described as curated journalism projects, and it offers an e-learning course for journalists.

The clearest indication that Facebook has transformed itself into a media company, despite its protestations, is the way it has redesigned its newsfeed over time. Research by Buzzfeed journalists explains that process. In 2012, Facebook introduced the ‘share’ button; not long after, fake news sites began to proliferate, their creators noting that people share without checking, particularly on mobile devices. By 2013, Facebook had followed Twitter in using hashtags to link items on specific subjects and had introduced a Trending feature. In November 2014, its CEO Mark Zuckerberg declared that Facebook’s goal was:

to build the perfect personalized newspaper for every person in the world. We’re trying to personalize it and show you the stuff that’s going to be most interesting to you.

What Facebook wanted, Buzzfeed reports, was to ensure people stayed on Facebook as long as possible, which meant their News Feed needed to be ‘interesting and relevant’, and in practice that meant reinforcing their views, not confronting them. Increasingly, celebrity content, political content and material from news sites are shared as much as, if not more than, people’s own original posts or photographs. In 2015, Time magazine explained how Facebook’s News Feed algorithm, regularly updated, takes into account thousands of factors to determine what shows up in any one user’s Feed:

How close you are to a person is an increasingly important metric, as judged by how often you like their posts, write on their Timeline, click through their photos or talk with them on Messenger, Facebook’s chat service. …The algorithm also assumes that content that has attracted a lot of engagement has wide appeal and will place it in more people’s feeds.
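To make that concrete, here is a minimal, purely illustrative sketch of how such a ranking might work. It is not Facebook’s actual algorithm; the factor names, weights and numbers are assumptions, chosen only to show how an affinity signal and an engagement signal could be combined to order a feed.

```python
# Illustrative sketch only: a toy feed-ranking score combining an
# "affinity" signal (how often the viewer interacts with the poster)
# with engagement signals (likes, comments, shares). All names and
# weights are invented for illustration, not Facebook's real system.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    shares: int

def affinity(viewer_interactions: dict, author: str) -> float:
    """Rough closeness score: recent interactions with this author, capped at 1.0."""
    return min(viewer_interactions.get(author, 0) / 50.0, 1.0)

def score(post: Post, viewer_interactions: dict) -> float:
    # Weight comments and shares more heavily than likes, then blend with
    # affinity; heavily engaged content can outrank a close friend's post.
    engagement = post.likes + 2 * post.comments + 3 * post.shares
    return 10.0 * affinity(viewer_interactions, post.author) + 0.05 * engagement

viewer = {"close_friend": 40, "news_page": 1}
posts = [Post("close_friend", 3, 1, 0), Post("news_page", 800, 150, 300)]
feed = sorted(posts, key=lambda p: score(p, viewer), reverse=True)
print([p.author for p in feed])  # the viral news-page post ranks first here
```

The point of the sketch is simply that once engagement is weighted heavily, widely shared material can crowd out personal posts, which is exactly the dynamic described above.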

Facebook’s Trending feature was curated by people we would ordinarily call ‘editors’, though Facebook said in 2015 that its algorithm alone determines trends. Others have described Facebook as a ‘social editor’. In May 2016, Gizmodo published a story in which former Facebook workers said they had suppressed news from conservative American sites. Zuckerberg allegedly had to reassure conservative publishers that there would be no bias against them, and human monitoring was reduced. Indeed, staff who performed an editorial curation role left the company in 2016 and subsequently had a few choice things to say about the ‘fake news’ scandal after the US presidential election.

The implication of all this, of course, is that the Facebook algorithm reinforces ‘confirmation bias’: we are more likely to click on and link to material which appears to confirm our own views and interests. The algorithm produced a News Feed that was ‘a petri dish for confirmation bias’, in Buzzfeed’s words: essentially an ‘echo chamber’. In fact, Facebook itself confirmed that ‘friends and family come first’: in June 2016, Adam Mosseri, its VP of Product Management for News Feed, said:

stories in News Feed are ranked — so that people can see what they care about first, and don’t miss important stuff from their friends.

Facebook’s ambition is unashamedly commercial — as Mosseri has also said:

We believe focusing on the user experience means that over the long run more and more people will use Facebook, they’ll spend more time on it and that’ll be good for them, good for Facebook and good for publishers.

If Facebook is a media company, then it should be regulated as a media company, as I have argued on this site and elsewhere. Achieving that will require concerted action: we need a strategic alliance of concerned citizens, academic specialists, journalists, and media companies who actually want to survive in this new media world. In the European Commission, concern has recently surfaced about the information supplied by Facebook on its WhatsApp acquisition. New data protection regulation is on its way in Europe — and that could just be the start.

Leighton Andrews is Professor in Public Service Leadership and Innovation, Cardiff Business School, a former Welsh Government Minister, and a former BBC head of public affairs.
