Turkey, algorithmic accountability, Cambridge Analytica and the risks in covering white supremacists
Our personal weekly selection about journalism and innovation. Join the conversation on Facebook and Twitter.
5 min read · Mar 23, 2018
edited by Marco Nurra
🔔 Pleased to announce our 700+ confirmed #ijf18 speakers. All festival sessions are free to attend. Come and join us!
- Free speech in Turkey dealt fresh blow with sale of independent media outlets. Doğan Holding, one of Turkey’s largest conglomerates, announced that it would sell its media assets to the pro-government Demirören Holding, according to news reports. Two of Turkey’s top-selling newspapers, Hürriyet and Posta, would be included in the sale. “By this huge takeover including Hürriyet, Turkish mass media industry comes under the direct political control of President Erdoğan,” said journalist Kadri Gursel on Twitter after the sale was announced. After Demirören bought the dailies Milliyet and Vatan from Doğan in 2011, the papers’ editorial lines became strictly pro-government, The New York Times reported.
🔔We’ll tackle this topic at #ijf18 with Yavuz Baydar, Marta Ottaviani, Sarphan Uzunoglu, Servet Yanatma, Stefan Candea, Zeynep Sentek and Craig Shaw
🔴 Searching for online media business models in Turkey
🔴 How can journalism networks help investigations under authoritarian regimes? Case study: Turkey
- Censorship, surveillance, and harassment: China cracks down on critics. Hours after the Chinese Communist Party proposed a constitutional change last month to lift presidential term limits, any words or phrases that remotely suggested President Xi Jinping was seeking a life term were blocked from social media. The word ‘disagree’ (不同意 in Mandarin) also disappeared from Sina Weibo, China’s Twitter equivalent, in late February. Since the measure was first announced, Weibo users trying to publish any post containing the word ‘disagree’ have received a system error message saying the message was illegal.
🔔We’ll tackle this topic at #ijf18 with Jianli Yang
🔴 An unpublished story of repressions worldwide
- Holding algorithms (and the people behind them) accountable is still tricky, but doable. “As reporters, we really need to push to hold people accountable. Don’t let corporations say ‘it was the algorithm,’” Nick Diakopoulos said. “You need to push on that more and find out where the people are in this system.”
- Julia Angwin’s Twitter thread on Facebook-Cambridge Analytica. “In light of Facebook-Cambridge Analytica kerfuffle, it’s worth reviewing Facebook’s long history of lax enforcement against apps and partners who steal data from its users. In 2010, I launched a series at the WSJ called WhatTheyKnow. Two of our blockbuster stories were about third parties using Facebook data without user knowledge. First, Emily Steel found a company called Rapleaf that was scraping data from Facebook members and using it to target political ads. Then Emily Steel and Geoffrey Fowler found that Facebook was leaking identifying information about its users — names and, in some cases, friends’ names — to third-party apps such as FarmVille. Facebook blamed the leakage of identifiers on the Web browser makers, but said that it had ‘zero tolerance for data brokers’ and would force Rapleaf to delete all the information it had in its possession. But we now know that Facebook wasn’t too worried about those breaches.”
🔔 Julia Angwin will be a #ijf18 speaker at the following events:
🔴 Holding algorithms accountable
🔴 Can journalism hold platforms accountable?
- Cambridge Analytica’s Facebook data abuse shouldn’t get credit for Trump. Misuse of data is bad, but some coverage of Cambridge Analytica suggests that knowing what someone liked on Facebook is enough leverage to transform elections. But is it really possible to psychologically manipulate voters via Facebook?
🔔 We will discuss this with David Stillwell, the co-author of the academic study on which the model that Cambridge Analytica adopted for the Trump campaign is based
🔴 Is it possible to psychologically manipulate voters via Facebook?
- Mark Zuckerberg speaks to CNN: The highlights. Zuckerberg said “I’m really sorry that this has happened” and that he is “happy to” testify before Congress.
- Facebook’s Campbell Brown: ‘We’ve been caught flat-footed’ on data scandal. “We are in a position now where we have to be judged by our actions,” said Campbell Brown, Facebook’s head of news partnerships. “We’ve been caught flat-footed,” Brown continued, saying Facebook was “focused too much on the positive” and not the negative aspects of the platform. “There is an awakening that is taking place inside the company where the mentality is ‘all hands on deck’ to address this.”
🔔 Campbell Brown will be a #ijf18 speaker
🔴 Quality news on Facebook: a conversation with Campbell Brown
- “Perspective, please,” writes Jeff Jarvis. “Note well that Facebook created mechanisms to benefit all campaigns, including Barack Obama’s. At the time, this was generally thought to be a good: using a social platform to enable civic participation. What went wrong in the meantime was (1) a researcher broke Facebook’s rules and shared data intended for research with his own company and then with Cambridge Analytica and (2) Donald Trump.”
🔔 Jeff Jarvis will be a #ijf18 speaker
🔴 Moral panic over technology: is it all that bad?
- Journalists covering white supremacists must weigh risks to selves and families. White supremacist movements have always been a force in American political life. But when a number of media-savvy, well-organized leaders of these groups explicitly embraced Donald Trump during the 2016 election, newsrooms began assigning more reporters to the story. The beat took on an added urgency last year, after a man taking part in a protest over the removal of a Confederate statue in Charlottesville, Virginia, drove his car through a crowd of counter-protesters, killing a young woman. “It’s become more necessary to have reporters trained to be able to cover this movement.”
🔔 We’ll tackle this topic at #ijf18 with David Neiwert, journalist, author, and acknowledged expert in American right-wing extremism
🔴 Alt-America: the rise of the radical right in the age of Trump
- That ‘fake news’ term. “I have found myself using the term Fake News more than any other word or phrase to describe what I, and many others, are fighting against in this war against untruth, despite the fact that I agree with everyone who says we need to do away with the term. Almost ironically, it is because of the discussion around the Fake News term and how bad it is that I have ended up using it so much. It ends up being a pretty good colloquial shorthand for ‘all that stuff which is problematic in the online information space’. But still not great.”
🔔 Shane Greenup will be a #ijf18 speaker
🔴 Fake news: you’re making it worse
- What’s still missing from the study of fake news? (A whole lot.) A big new report from the Hewlett Foundation pulls together existing research on social media, political polarization, and disinformation to show where we still need to know more.
- Google announces a $300M ‘Google News Initiative’ (though, unlike its program in Europe, this isn’t about giving out grants directly to newsrooms). Google said Tuesday it’s committing $300 million over three years toward various products and initiatives intended to help news publishers and sweeten Google’s relationships with them. Poynter receives $3 million to lead a program teaching teens to tell fact from fiction online.
🔔 We’ll tackle this topic at #ijf18 with Madhav Chinnappa, Google’s Head of Strategic Relations, News & Publishers
🔴 Google News Initiative: a conversation about news with Madhav Chinnappa
- Three examples of machine learning in the newsroom. What experts have to say about the use of machine learning in the newsroom, and what data journalists can learn from it.