Transparency, liability and opt-outs: social media has bigger problems to fix than just Facebook

Alex Lane
Published in Five by five
Apr 17, 2018

The Cambridge Analytica revelations have thrust Facebook into the spotlight of politicians and regulators, but Facebook’s really just the poster child for a much wider problem for social media and smartphone culture.

As villain-du-jour and android-impersonator Mark Zuckerberg appeared before US House and Senate members, the latter showcasing little but their ignorance, I thought about the changes I’d make to limit the power of Facebook, Google, Twitter and whoever may succeed them.

British politicians have already been shown our post-Brexit place in the pecking order, when Zuckerberg promised to send a minion to answer questions on Facebook’s role in our ongoing disastrous divorce from Europe. Imagine if we were a functioning member of an economic bloc which could exert genuine power over global corporations?

So here are five ideas I’d like to spitball — they’re not solutions, just thoughts that need a lot more work.

Human minds are ripe for hacking and it happens all the time (designhacks.co)

1 Stop hacking our brains. Social networks, apps and even news providers all exploit dopamine hits and a host of other psychological heuristics to keep us reading, commenting and otherwise engaged. They’re hacking our minds.

There’s really very little functional difference between a successful social networking app and an addictive gambling app. Both of them abuse the systems our brains evolved to manage life in pre-industrial, pre-technological societies. We’re smart apes, but we’re not as smart as the things we’ve made to entertain ourselves.

I don’t know how you’d regulate this abuse. It starts with acknowledging that free will is at best a fragile privilege, at worst a myth. So much of our law is based on the notion that we’re rational individuals with conscious control over our choices, and there’s so much evidence that this often isn’t true. Maybe my next idea would help.

Obligatory xkcd

2 Open up the algorithms’ black boxes. The brain hacking is conducted through complex algorithms which are the most closely guarded secrets of the search and social network giants. When Facebook and Google were mere multimillion-dollar international startups, it was OK for them to claim their algorithms were too commercially sensitive to be investigated and regulated. I believe that their social, political and economic power now overrides commercial needs.

Search and social networking experts treat these algorithms like the ‘black box’ model of the mind used by 20th Century behavioural psychologists: you don’t know what happens inside, so the only way to experiment is to change something and see how it reacts. The difference is that these algorithms are being constantly redesigned — often by semi-autonomous near-AI computers — and the networks themselves admit they don’t entirely understand how their algorithms work, with the implicit admission that they’re happy so long as they get the results they want.

This isn’t good enough. As we bumble towards increasingly autonomous AIs, I can’t help thinking this complacency is a really bad idea. Regulators need to get ahead of the problem.

It’s not an easy challenge, but governments and their regulators need to open the black boxes and look inside. They can do this confidentially, just as they do when investigating competition and market abuses. Services which won’t play can face legal or financial penalties.

3 Greater corporate responsibility. Brighter minds than mine, such as Iain Banks and Charles Stross, have pointed out that modern corporations are human-powered AIs which strip people of their moral and personal responsibility while making use of their mental resources (there’s a reason that ‘computers’ are named after the people who used to do mathematical legwork before the Information Age).

Most of us have seen, or even taken part in, ethically dubious decisions justified for ‘business’, which is a way of saying: “My membership of the corporate body absolves me of personal ethics and contradicting this decision would constitute an act of employee misconduct.” When things go wrong, as they inevitably do, the corporate body is terribly surprised and bolts on rules about corporate responsibility that are about as effective as a Band-Aid on an amputation. It also explains why the most ethically challenged people rise to the top of corporations, and become the leaders and role models for corporate society.

The problem is that if corporations are people, they’re not adults, they’re children; all ego and no impulse control. In a world where corporations have the rights of people, CEOs, chairmen, and boards need to be held personally responsible for the actions of their organisations, unless they can demonstrate reasonable attempts to prevent harmful behaviour. They should also be liable to the same penalties as individuals, rather than the soft financial penalties corporations face for most misbehaviour. In a multinational context, this would devolve to the regional or national heads, with the same defence.

I’d add an additional level of statutory governance to near-monopoly services like Google, Facebook and Twitter: user boards on national or regional levels with representation at board level. Consumer and business representatives should have a direct voice in organisations with such enormous reach — and the idea could be valid for other powerful supranational corporates like Sky, Apple or Amazon.

The endgame of capitalism is not “build a vibrant and diverse marketplace” (William Warby/Flickr)

4 Break up the monopolies. Corporations often expand by acquisition, but the likes of Google and Facebook have taken it to another level, gobbling up potential competitors and parallel markets. It’s the dream of every startup entrepreneur and venture capitalist, but while it’s OK for Facebook to have Messenger, it’s simply unhealthy for it to own Instagram and WhatsApp.

It’s often bad for the end-user, and it’s frequently bad for the company that’s acquired. When the buyer has stripped out the technology it wants and swallowed the user base, there’s no incentive to innovate or develop the core product and the new owner often gets bored or distracted by another shiny new toy. Has Oculus done anything interesting since it was swallowed by Facebook? Has! Yahoo! ever! improved! anything! that! it’s! acquired!? No!

It’s a myth that capitalism encourages competition — all the evidence says it compels companies to seek a monopoly by any means necessary.

Governments need to be far less willing to let corporations build these empires instead of encouraging competitive markets that generate employment, innovation and taxation.

5 Compulsory advertising and tracking opt-outs. It’s become a cliché that for any social network, users are not the client, they’re the product. Our data is packaged and sold to advertisers. Even businesses don’t pay for their pages, although the game is rigged to make them pay to reach the people they want. This “you’re the product” hypothesis stands on two legs: networks don’t have the scale they need to make charging users profitable; and people wouldn’t pay for the service.

I’d argue first that many people would find Facebook indispensable, even though the benefits it brings are nebulous. Second, Facebook’s Q4 2017 results report more than 2.13 billion monthly active users and 1.4 billion daily active users, and in Q2 of 2017 it reported $9.16 billion quarterly revenues. Not annual, quarterly. So imagine offering those daily active users the chance to opt out of advertising — out of being the product — for $5 a month. Even if only 25% took the offer, that’s $5.25 billion per quarter.
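That figure is easy to sanity-check. A minimal back-of-envelope calculation, using the article’s own assumptions (1.4 billion daily active users, a 25% take-up rate, $5 a month — none of these are Facebook’s real numbers):

```python
# Back-of-envelope check of the opt-out revenue estimate above.
# All figures are the article's assumptions, not actual Facebook data.

daily_active_users = 1.4e9   # reported Q4 2017 daily active users
opt_out_rate = 0.25          # assumed share of users paying to opt out
monthly_fee = 5.0            # assumed fee in dollars per month
months_per_quarter = 3

quarterly_opt_out_revenue = (
    daily_active_users * opt_out_rate * monthly_fee * months_per_quarter
)

print(f"${quarterly_opt_out_revenue / 1e9:.2f} billion per quarter")
# → $5.25 billion per quarter
```

Small changes in the take-up rate move the result linearly, so even a far more pessimistic 10% take-up would still yield $2.1 billion a quarter.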

Naturally, it would reduce Facebook’s value as an advertising platform, but in this scenario I don’t care that Facebook would have to cannibalise its advertising income: offering an ad-free paid service is compulsory. Users get to choose whether they’re the product or the customer.

I’m not an economist and I have an undergraduate-level understanding of psychology, but I have spent more than a decade reporting on a regulated consumer-technology industry.


I write what I want to, when I want to. If you’re interested in the novels I’m writing, take a look at www.alexanderlane.co.uk