Facebook, 2016 election
Paul Ford

Thanks, Paul. This was great and (and this still means something) true.

History is either repeating itself or we are in a new world with serious echoes. One thing the past has shown us is that then-new media gets regulated once it grows to a scale that threatens democracy and the people who run it. So with FB, it's not so much a question of whether it will be regulated, but when and by what standard. That is why much of FB's behaviour, certainly from a PR point of view, resembles a cable company's, even though its revenues are largely derived from activities that are, really, those of a media company.

That fear is existential for the team out in Palo Alto and, if that Gizmodo piece is accurate, the fear of political retribution is what kept their hands off. But the impartiality the algorithm suggested is failing, driven by human decisions in aggregate if not directly, and it is about as much consolation as being handed a beloved-and-flattened household pet by a man whose Tesla did the deed on autopilot. People of every political persuasion simply do not like finding out that they have been lied to, especially by the same thing that holds photos of their kids.

Far more worrying than being lied to by third parties is being manipulated first-hand. Per FB's own research, the feed lowers our emotions to an angry hum, and I would argue that that antipathy is far more destructive for our democracy, or just for ourselves, than the globalisation of the dodgiest tabloids. That is the innovation. (I did write about it, and some solutions to the problem, here.)