Emily Bell on platforms: Regulation is inevitable
We talked to Emily Bell, professor of digital journalism at Columbia University and the founding director of the Tow Center for Digital Journalism, about Facebook’s algorithm changes, how she’d deal with the platform if she were running a news organisation, and why she has no time for trust as a metric for quality journalism.
This interview is a preamble to the Tow Center research report on the relationship between technology platforms and journalism, which Emily Bell will unveil in her lecture at the GEN Summit on Thursday 31 May in Lisbon, Portugal. The interview has been lightly edited.
GEN: Do you think Facebook will ever provide a good environment for news or do you think its business model is inherently incompatible with news?
Emily Bell: What we have learned over the past couple of years is that the way Facebook’s targeted advertising business model works discriminates against high quality news reporting. If you like, it incentivises the ‘wrong kind of news’, as it inherently doesn’t differentiate between any type of published material. That said, readers and viewers find Facebook a compelling environment, and in many countries, one of the only environments for news. So it is a business model problem, and fixing that will be extremely challenging as it goes to the core design principles of Facebook.
In this article by Mathew Ingram, Jeff Jarvis is quoted as saying that we should be reinventing journalism in partnership with Google and Facebook, ‘because they’re a lot fucking smarter about it than we are’. What do you make of this? Do such partnerships have an impact on editorial independence?
I love Jeff, he is an old friend and we spar about many things. It won’t surprise him (or anyone) to hear that I think he is trolling journalists here. It is absurd and against all available evidence to say that Google and Facebook are smarter at news production or distribution than journalists. They work in a completely different way. Working in partnership with Google and Facebook remains a difficult business for serious news outlets, as it is clear these are institutions of great power and involvement in all aspects of life which have to be held to account by journalism. If you asked most news organisations in the US whether they ought to have a partnership with the Government, they would be appalled (Jeff too, I suspect). Effectively, partnerships with platform companies are not that different. It is not about the closeness of the partnership but about how appropriate it is for the business of journalism. From our research at the Tow Center, the relationship with platforms definitely has a strong effect on editorial output and on how the public might perceive journalism.
How can news organisations make the most out of what Facebook has to offer without becoming dependent on the platform?
A number of social-first news organisations have learned a tough lesson in the past couple of years, namely that you cannot build your business on someone else’s land without significant risk. There has to be a business model which relies on the strength of the audience relationship with your journalists and news brand, where people will subscribe or donate. A smart strategic move is to think about how to make sound technology decisions within a news organisation, both to integrate with third parties and to build your own products and relationships. Easier said than done…
If you were running a news organisation, what would your relationship with Facebook be?
Haha, good question. At arm’s length, I think. I am very proud of how my alma mater The Guardian has navigated the choppy waters of big tech by reporting important stories about the companies without fear or favour, whilst at the same time improving its own business model. I look at other role-model organisations, like Recode under Kara Swisher, which has a literacy and understanding of the technological environment, but breaks story after story on the new gatekeepers too.
What are you most worried about regarding the recent Facebook algorithm change? What impact does it have on the news industry?
Algorithms will change frequently, and to some extent, coping with this is part of the business of news strategy these days. The more worrying aspect of this particular algorithm change is the precedent it sets for Facebook being able to institute changes that disorient whole news ecosystems without insight or transparency. Living in a dictatorship (even a benign one) means not knowing what your life will be like each day and losing the ability to effectively self-govern. Where the algorithm is effectively law, Facebook has to be more diligent about due process… as do all technology companies.
In an interview with CJR, Facebook’s Campbell Brown said, ‘What we’ve done is to down-rank clickbait and down-rank sensationalism, and we’ve given a boost to broadly trusted news sources, or what we define as informative news sources.’ How can Facebook optimise for quality journalism or ‘trust’? Is Facebook serious or is it just a public relations argument?
I think Facebook is serious. There has been a long history of internal debate at Facebook about how different types of material ought to be promoted. Unfortunately, the free-for-all model, which was so effective for growth and revenue, has been disastrous for many other less profitable aspects of civic life. ‘Optimising for trust’ is not an idea I have much time for, as trust is completely the wrong metric. Optimise for reliability, consistency, original reporting, accuracy, willingness to publish corrections, etc. We will continue to have noisy debates about what material is promoted where and how effectively. This is no bad thing. But we also need to be able to monitor the ultimate effects of these changes, which is difficult as both people and data have not been very accessible. We need to know how the interventions tech companies make in information markets play out, not just in the West but in other markets where they can have a devastating effect.
Do platforms need to be more regulated and how? Frederic Filloux in a Monday Note post suggests one way would be to ban Facebook’s targeted marketing practices. What do you make of this? What would your suggestion be for a newly improved and regulated Facebook?
Regulation is inevitable. Banning targeted advertising is not going to happen, but we do have to think about what sort of rules need to be developed in a world where we all receive messages which are personal and often confidential. In other words, when it is impossible for a citizen to know which messages they are receiving and why, we have a problem. I think there is a lot to be said for personal data protection laws and regulations helping curb the overreach of commercial companies. There is also still a case to be made that platform companies can be broken up. It is not a popular view, but we should be asking why Google needs to own YouTube, or why Facebook has to also own Instagram and WhatsApp.
Facebook has committed to a civil rights audit. Do you see this as more of a reputation saviour or a meaningful step towards creating a less biased environment?
There are some really useful efforts which try to contextualise how technology companies can have a negative impact on populations. Rebecca MacKinnon does this for platform and technology companies with the Ranking Digital Rights project. I am sure there is an element of PR involved in Facebook’s efforts, but there is also an element of self-preservation. The next set of significant problems for Facebook will most likely emerge through its activities in fragile democracies or effective dictatorships. We are seeing this already in Myanmar, in the Philippines, in Sri Lanka, and in other countries. Facebook needs an audit which is meaningful and helps the company make difficult decisions in markets where it has much power but little familiarity. If the company had instituted this approach for political advertising ten years ago, it might have avoided a lot of the reputational damage it is seeing now. Audits, however, don’t make up for continual accountability through data and corporate accessibility.
If Facebook’s algorithms were to be more transparent, what would it mean for the user? Is it a fantasy or something that can actually happen?
It’s really not that hard to add a little transparency into how platforms work and this is wholly beneficial for readers and viewers. But transparency isn’t, as we know, a substitute for real accountability.
To finish on a positive note: With all the complaining about Facebook, what’s Facebook done that’s good for journalism?
The social web has done a tremendous amount for journalism. It has expanded our sources, allowed us to access information that was otherwise not available, helped spread important news quickly, and created the habit of readership in a new generation in an entirely different way. And Facebook has been a part of that. I just think the balance of costs and benefits is not where it needs to be.