New social contract for social media, V2
The “Cambridge Analytica scandal,” as seen in so many headlines, is giving way to a more thoughtful — and crucial — international discussion about not only data privacy but an even bigger question: where our social development stands at this point in the planet’s technological development, the part we call the Internet. Here are a few thoughts on that and, below them, links to coverage that I feel actually advances our understanding of the “scandal” and what it points to (links I’ll continue to update):
Where this moment in the Internet’s evolution is concerned, author and New York Times columnist Tom Friedman writes that we’re moving into “the second inning.” The first inning, he says, was full of promise; the one we’re moving into not so much — maybe just about the opposite.
But to continue the metaphor, baseball games do have nine innings, and, more importantly, tech worries have been with us since at least Socrates, who famously worried about what the technology of writing things down would do to our memories. Consider taking a longer view of what we’re seeing, in terms of history but also in terms of our own moment right now.
“We need to start by pausing to reflect on how our world, reshaped by these technologies, operates differently,” author and business ethicist Dov Seidman told Friedman.
Stop and think about what? Here are some ideas:
- About how we’re not stuck here forever. All this is in process. The Internet companies are scrambling to retain our trust; Facebook CEO Mark Zuckerberg has even told CNN that his industry may need to be regulated in some way. Five or 10 years ago, the focus was entirely on industry self-regulation. Now, I don’t know how, for example, electoral regulators in one country can help social media users voting in other countries (such as India, Facebook’s largest market), but this is at least a clear sign that the social media companies, and not just their users, are thinking about the trouble they’re in right now and what to do about it.
- About the fact that 100% negative and scary is not 100% reality. Remember that our brains have a negativity bias (see this talk by social psychologist Alison Ledgerwood).
- About the impact that exposure to endless scary headlines about tech and “tech addiction” — not just to the tech itself — has on us and our children. Are we parenting from the fight/flight/freeze part of our brains because so much incoming information says we’re all just sitting ducks? If so, how is that working?
- About how we have a choice about being victims or not — and not just by choosing to #deleteFacebook. We can choose to be a little smarter and more selective than that. Remember that information and self-awareness are power. For example, if we notice that we’ve become slaves to app notifications, we can decide which notifications matter to us, at what time of day and under what conditions, at this point in our lives, and turn the rest off, for pete’s sake. Turn that granularity into an advantage!
- Most importantly, about what’s truly powerful and protective — in our lives, families, this point in time and all time — and focus on that. It’s our humanity that connects us, moves us and protects us — see this about the internal safeguards we can help our kids cultivate and this about families’ best antidote for a developing media siege mentality.
Social institutions, not platforms
On that last point, Dov Seidman uses the term “fused world,” and not just for how such a huge swath of the planet’s population is connected now. “In the fused world, the business of business is no longer just business. The business of business is now society,” Tom Friedman quoted him as saying. Along that very line last week, The New Yorker’s Anna Wiener referred to Facebook as “a social institution,” not a software company or a platform or a media company or even a hybrid of those three things. It’s even more than that: It’s a planet-wide social institution run by a single corporation based in just one of the many, many societies worldwide where it has tremendous social impact. And of course Facebook isn’t the only Internet company that has become a global social institution.
The companies have to see themselves as such. And WE have to see them as such. Just for starters. That’s why they have to regain, or simply gain, the public’s trust. If we want something positive to emerge from this second inning, we can’t reflexively throw old “solutions” at a completely new set of conditions. We have to get thinking, all stakeholders together — including younger ones, as smart US students have demonstrated — and that depends on agency, not messages of fear and victimization. It requires working together on a problem this planet hasn’t faced before: dealing with, maybe regulating, social institutions that cross all borders and involve all cultures, political systems and value systems — the social contract of a networked world.
Continuing to update these…
- Of regulation (July 2018): After explaining why “our regulatory tools are growing so useless,” University of Toronto law professor Gillian Hadfield calls for “super-regulation,” writing in Quartz.com that “it’s ‘super’ because it elevates governments out of the business of legislating the ground-level details and into the business of making sure that a new competitive market of private regulators operates in the public interest.”
- 4/12/18: A little history on the business model, as told by Mark Pritchard, who, as Procter & Gamble’s chief marketing officer, “runs the world’s largest advertising budget,” according to John Battelle as he introduced Pritchard on the stage at NewCo Shift Forum 2018. “With Google, YouTube, Facebook and Twitter,” Pritchard said, “we along with many others essentially created the entire digital media ecosystem, because it didn’t exist 10 years ago…” with Battelle adding, “essentially helping them invent their business model….” “Yes, they were platforms for communication with people. They had no advertising business,” Pritchard added. “And what’s interesting about that is, they didn’t build these platforms for advertising, so some of the challenges they’ve had recently, I think, have been because they were built for another purpose.”
- In an April 2 interview Mark Zuckerberg gave Vox founder and editor-at-large Ezra Klein, Klein wrote: “There is no way to track, or even understand, all that is happening on Facebook at any given time. Problems that look small in the moment — like organized disinformation campaigns mounted by Russia — reveal themselves, in retrospect, to be massive, possibly even world-changing, events.” The article in Vox: “Mark Zuckerberg on Facebook’s hardest year, and what comes next”
- “Social media platform changes March 2018”: WGBH (Boston)’s social media director Tory Starr has been tracking changes SM providers (not just Facebook) have been making around user data protections. She posted this roundup for WGBH reporters and the public on April 1.
- The big picture from the New York Times’s editors: In a 4/1 editorial, “Facebook is not the problem. Lax privacy laws are.”
- From someone who’s been there: We keep hearing about ex-Facebook and -Google employees’ negative views of their past work. So this is different: TheVerge.com’s Casey Newton recently asked one of them, Justin Rosenstein, who developed Facebook’s Like button, “Is social media good or bad?” Rosenstein responded, “I think it’s good for the world on balance. The question is so complicated, it’s hard to compute. But the good parts of social media have become so taken for granted that we’ve stopped praising them. And the bad parts, people are starting to see for the first time. So people are like, ‘Oh, it’s all terrible.’ That’s a very unbalanced perspective.” He offered up the #metoo movement as an example — “a hugely important, civilization-level conversation — millions of people in a week.” And he pointed to Jared Cohen at the State Department saying that “Facebook’s mere existence in its first five years did more to help with relations between Arabs and Israelis than 30 years of coordinated attempts by the CIA. This really basic stuff you get from connectivity is so powerful…but people just take that for granted at this point.” Rosenstein also pointed to many problems with social media, but said, “I’m hopeful. I think these are all fixable problems. You look at industries like tobacco. The difference between this and tobacco: no matter how you package that product, it’s harmful. Whereas social media, if done the right way, if we have a commitment to making sure the content we’re showing people is relevant to them, if we’re only sending notifications when something is actually timely and important, the potential is for the pie chart to move very much in the positive direction.”
- Aleksandr Kogan’s exploit not unique: “The outrage now directed at Cambridge Analytica and Facebook suggests there might be an appetite for an online ecosystem based on a different compact between consumers, platforms and advertisers. But we won’t build that ecosystem by pretending that this is a matter of a few bad actors,” writes Alexandra Samuel, PhD, in “The shady data-gathering tactics used by Cambridge Analytica were an open secret to online marketers. I know, because I was one” at TheVerge.com. “It’s time for us to face up to what online marketers and researchers have known for more than a decade: the contemporary Internet runs on the exploitation of user data, and that fact won’t change until consumers, regulators and businesses commit to a radically different model.”
- Cambridge Analytica’s exploits pale next to its parent’s: Its work in the U.S. is a subset of that of its U.K.-based parent company, SCL Elections, which Prof. David Carroll at The New School in New York calls a global “black-ops” military contractor. SCL has worked on “more than 100 election campaigns in over 30 countries spanning five continents,” Quartz reports. Carroll is suing Cambridge Analytica in Britain for failing to send him, on his request, all the data they have on him as required by British law, Columbia Journalism Review reports. The company had only sent him “about a dozen” data points (e.g., birth date, gender, zip code, etc.) while publicly claiming to have 4,000–5,000 on each voter in their system. Carroll also told CJR, “Facebook isn’t the only source for this data — commercial entities like Acxiom, Experian, comScore and so on are also involved.”
- Further context: Motherboard reports, “If you’re mad about Cambridge Analytica taking data from Facebook you should be absolutely livid about AT&T and Verizon.”
- As for advertisers: In Ad Age, “What’s next for Facebook, and its advertisers?”, a commentary, and later: “Facebook consolidates scattered privacy settings menu”
- Disagreeing with Mark Zuckerberg: “Don't regulate Facebook,” by Donald Graham, former publisher of the Washington Post, in the Washington Post (thanks to Stephen Balkam at the Family Online Safety Institute for pointing this out)
- More on the regulation question (and other issues) from journalism professor and author Jeff Jarvis, including Facebook’s obligations, users’ free will and this about today’s youth: “This is an articulate generation. The collection of Facebook, Twitter, Instagram, YouTube, and Snap did not ruin them. It empowered them. It connected them. It taught them how to speak to a public. In these dark, divided…times when even an optimist such as myself could start to lose hope, I have regained my optimism watching, listening to, and following these young people.” Me too.
- That insightful New Yorker piece I mentioned above, by Anna Wiener, Nathan Heller, Adrian Chen and Andrew Marantz: not to be missed
- Multiple perspectives in my post last month: “Real news: UK lawmakers’ formal ‘fake news’ hearing in the US”
This piece was originally posted on Anne Collier’s blog at NetFamilyNews.org.