A Small World? WhatsApp’s Big Misinformation Problem

Tactical Tech
Apr 7, 2020 · 9 min read

by Stephanie Hankey

Collage: Facebook’s ‘free street’ in Menlo Park, California, 2020

“We just want to keep it small,” were the words I heard from a representative of the messaging service WhatsApp, now used by over 2 billion people worldwide. I was standing in Menlo Park, California, in the middle of a street lined with shops serving muffins and pizzas. But there were two strange things about this street: firstly, everything in the shops was free; and secondly, it was bustling with thousands of Facebook workers. This is what it is like to step inside the real world of Facebook, and of Instagram and WhatsApp, both of which it owns and which share its headquarters. On one level it looked and sounded like the world most of us inhabit, but fundamentally nothing was quite the way the rest of us experience it.

Before entering this street I had to wait in a room wallpapered with protest posters, staffed by a man in a relaxed uniform who shouted ‘welcome in!’ in a celebratory tone to all who entered. I stood at a touch screen and scrolled through pages of small print before being prompted to sign ‘I agree’. Like getting access to a secret world, once I signed, I could enter. Despite being a critic of the big tech platforms, it was not the first time I had been allowed into the free ‘Facebook street’, which feels like a mix between the Truman Show and The Circle, but it was the first time I had met a representative from WhatsApp. The representative, from their communications and policy department, spoke in earnest tones. We talked about the report that my organisation, Tactical Tech, and its partners in Brazil, Colombia, Kenya, Malaysia and India had published on the negative use of WhatsApp in elections. In these countries, many experience WhatsApp as the internet itself and receive the majority of their news and election materials, whether real or not, through it. It felt like a cordial walk and talk with a sensible colleague. As our half hour drew to an end, the man who worked for a company with annual revenues in the tens of billions of dollars and over 25,000 employees turned to me and said: “We’d love to do more, but we just don’t have the resources.”

Two years later, I wonder what that street looks like now. Deserted, like the streets of all other cities. The engineers, designers, marketeers, managers, researchers and policy people are no longer sitting on benches next to manicured green spaces while munching free snacks, but wearing their corporate hoodies in the shelter of their home offices, looking at their global community from the vantage point of their laptops. The free street has emptied out; yet the rest of us, behind our closed doors, are more reliant on WhatsApp than ever.

“The free street has emptied out; yet the rest of us, behind our closed doors, are more reliant on WhatsApp than ever.”

For most of us, coronavirus is the topic we want to avoid at dinner and the one we wake up in the morning and realise still exists. The current health crisis is bringing to the surface all the things that already barely work in our lives and societies and stretching them to breaking point, from strained domestic arrangements to overburdened credit cards, from gig economy workers to greedy landlords, and from neglected communities to under-resourced health services. The crisis emerging on WhatsApp is no different. Although WhatsApp may help us feel like we can do more with less, this moment of dependency among so many communities also amplifies its faults. With all eyes on the parent company, Facebook has downplayed WhatsApp’s significance in the public eye and has not done enough to identify and fix challenges in advance. WhatsApp has certainly taken steps to ‘introduce friction’ in messaging, continuously reducing the number of chats to which you can easily forward a heavily forwarded message (from around 250 at once a few years ago down to 20, then five last year, and as of today just one at a time). This change has reduced the easy, intentional dissemination of malicious misinformation. However, content shared on WhatsApp is truly viral: passed along one contact at a time, it spreads much the way the pandemic itself does. And WhatsApp has known about this problem for some time. Putting new measures in place in the middle of a crisis, rather than beforehand, suggests that they are acting now only to save face. The real question is: what more should they be doing, now that the spotlight is on them?
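The mechanics of that friction are easy to sketch. Below is a minimal illustration in Python of how a client-side forwarding limit works; the thresholds, names and numbers are hypothetical stand-ins for the behaviour described above, not WhatsApp’s actual values or code.

```python
# A hypothetical sketch of 'friction' on heavily forwarded messages.
# The thresholds and limits below are illustrative stand-ins, not
# WhatsApp's actual values or implementation.
from dataclasses import dataclass

HIGHLY_FORWARDED_THRESHOLD = 5  # hops before a message counts as 'viral'
VIRAL_FORWARD_LIMIT = 1         # as of April 2020: one chat at a time
NORMAL_FORWARD_LIMIT = 5        # the limit introduced in 2019


@dataclass
class Message:
    text: str
    forward_count: int = 0  # how many hops this copy has already made


def allowed_chats(msg: Message) -> int:
    """How many chats this message may be forwarded to in one action."""
    if msg.forward_count >= HIGHLY_FORWARDED_THRESHOLD:
        return VIRAL_FORWARD_LIMIT
    return NORMAL_FORWARD_LIMIT


def forward(msg: Message, chats: list[str]) -> Message:
    """Enforce the limit, then pass the message on with its counter grown."""
    if len(chats) > allowed_chats(msg):
        raise ValueError(f"can only forward to {allowed_chats(msg)} chat(s)")
    # Friction slows the spread but does not stop it: each recipient can
    # forward again, so the message still moves person to person.
    return Message(msg.text, msg.forward_count + 1)


rumour = Message("Blowing a hairdryer up your nose cures it!", forward_count=7)
print(allowed_chats(rumour))  # 1: heavy friction, but the chain continues
```

Lowering the per-action limit shrinks the branching factor of each share, which is why the one-chat-at-a-time rule slows a viral message without stopping it.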

The buzzword at Facebook is ‘community’. It indicates a shared purpose: Zuckerberg’s vision of ‘connecting the world one community at a time’. As a self-appointed goal, it’s admirable. Communities are for the good times — the joy of connection, love of sharing and solidarity with others — and we have seen that more than ever in this global pandemic. We might feel closer to our own personal community of friends and family than before the lockdown. WhatsApp now brings me my brother from Australia straight into my living room by video, twice a week — I drink a coffee, he drinks a beer — we chat about politics and life, just like the old days. It’s a small world. What’s not to love? We don’t even have to know the people WhatsApp brings us to feel the sense of community. They are there in your hand as you watch a video of thousands clapping from balconies in appreciation of health workers, in a city you’ve never been to, filmed by people you don’t know, yet witnessed at the precise moment it is convenient for you to engage with the world. A mini theatre of human kindness and faith. It’s a beautiful thing.

Now we have proof: WhatsApp can facilitate feelings of community amongst its users in the good times and the bad. Yet, as with many things in Silicon Valley, ‘community as business’ is at best a metaphor. The last few years have shown that big tech companies can’t only sell us the benefits and ignore the challenges. Facebook has struggled to contain a string of public scandals, from Facebook Live being used to broadcast the massacre of 51 people in New Zealand to the millions of people whose personal data was abused by Cambridge Analytica. They have left us wondering how a company that claims to know so much about community knows so little about people. Yet, despite the efforts of many civil society groups and journalists to highlight the impact of Facebook-owned WhatsApp on hate speech, misinformation and elections worldwide, WhatsApp has escaped the same scrutiny. The smaller sibling has sheltered in the shadow of its bigger brother, largely unscrutinised until now.

“WhatsApp is so powerful because it is so deeply intimate.”

WhatsApp is so powerful because it is so deeply intimate. Messages you really want to read are mixed with those you don’t. Rumours, gossip, leaks and inside jokes are pushed to you directly. The fact that these images, videos and soundbites are visibly amateur — the highlighted screenshot, the shaky mobile phone video or the badly recorded Skype call — makes them feel all the more ‘evidential’ and somehow more ‘authentic’. This is partly why misinformation on such platforms gets so much traction with an older generation who may not have developed the same level of digital scepticism as their children and grandchildren. To social media users this may all sound familiar, but WhatsApp gets to us in a different way. It has all the characteristics of something you would trust, yet none of the qualities.

The media reports daily on misinformation and fake news, yet these stories continue to play on our deepest hopes, fears, biases and values. This crisis is teaching us that misinformation on WhatsApp comes in many forms:

  • Claims: images taken from a different time or place that spread panic and uncertainty, like images of stampedes at supermarkets that were actually taken five years ago.
  • Comedy: memes, mash-ups and fake leaks, with humour ranging from dry to dark to daft, like a doctored CNN report that ‘Sex Cures Coronavirus’.
  • Conspiracies: rumours and organically growing fears, presented as truths: ‘it’s a pandemic engineered by pharmaceutical companies’ or ‘it was created by an evil mastermind to wipe out the old and the sick’.
  • Cures: fake promises of protection with home recipes, instructions and how-to guides, from blowing a hairdryer up your nose to wearing your mask inside out to stop germs coming in.
  • Cons: things you can buy that will save you from the pandemic: anti-malarial drugs, home remedies and rare supplies from companies that don’t exist.

As the crisis goes on, the examples will change but the problems will remain. Some of these misinformation nuggets are harmless, but others create public confusion, entrench differences and extend generational and political divides. Some may even cost lives. In places like Brazil, where WhatsApp is zero-rated (it incurs no mobile data charges, unlike the wider web), these stories are even more powerful, as verifying them elsewhere costs data that many users cannot spare. As coronavirus misinformation is reshared and embellished, its apparent validity grows in an ever-tightening social loop: is paracetamol better than ibuprofen? Who knows anymore?

The elephant in the room is encryption. Because messages are end-to-end encrypted, WhatsApp can’t read what people are saying, but they can still see who is sending messages to whom, from where and how often. This is also why WhatsApp cannot use automation to identify and remove fake news, as Facebook and others do. The inability to read content algorithmically is now the main argument used by those who want to remove encryption for national security reasons. Yet this is misguided. Focusing on encryption as the problem threatens to remove WhatsApp’s strengths (privacy and the protection of its users from targeted advertising) without successfully countering its weaknesses.
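To make that distinction concrete, here is a minimal sketch in Python, using the cryptography package’s Fernet recipe, of an end-to-end encrypted relay. The shared-key setup and the names are simplifications of my own (WhatsApp actually uses the Signal protocol, which derives fresh keys per message), but the asymmetry it demonstrates is real: the server can log who messaged whom, when and how much, yet holds only unreadable ciphertext.

```python
# A toy end-to-end encrypted relay: the server forwards opaque bytes and
# records only metadata. The shared symmetric key is a simplification;
# WhatsApp uses the Signal protocol with per-message keys.
# Requires: pip install cryptography
from datetime import datetime, timezone

from cryptography.fernet import Fernet

# Hypothetical setup: Alice and Bob agree on a key off the server
# (in practice this happens via an authenticated key exchange).
shared_key = Fernet.generate_key()
alice, bob = Fernet(shared_key), Fernet(shared_key)

server_log = []  # everything the relay can ever know


def relay(sender: str, recipient: str, ciphertext: bytes) -> bytes:
    """Deliver the message while logging metadata, never content."""
    server_log.append({
        "from": sender,
        "to": recipient,
        "when": datetime.now(timezone.utc).isoformat(),
        "bytes": len(ciphertext),  # size, but not a word of the plaintext
    })
    return ciphertext


token = alice.encrypt(b"Is paracetamol better than ibuprofen?")
delivered = relay("alice", "bob", token)
print(bob.decrypt(delivered).decode())  # only Bob can read this

# The server knows who talked to whom, when and how much (the 'who,
# from where and how often' above) but never what was said.
print(server_log)
```

This is why metadata analysis remains possible on an encrypted service while automated content moderation does not, and why removing encryption would trade away the privacy without touching the sharing behaviour that actually spreads misinformation.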

We need to start by looking at the systemic nature of the problem. It is important to create simple checks and balances that slow down the malicious spread of misinformation, but we need to do better than that. We cannot allow a company to describe a platform with 2 billion users as personal and ‘small’. Equally, in looking for solutions, we cannot treat digital platforms as though they don’t mix. Solutions cannot only be self-imposed by companies, one platform at a time. Information moves from encrypted channel to non-encrypted channel and back again. My 70-year-old father can find a video on TikTok (yes, he uses it) and forward it to my family WhatsApp group, and my mother, in a different part of the country, can repost it to her friends on Facebook. That’s how misinformation really moves. Platforms are just the surfaces on which our viral information sits before it is picked up and passed on. Much ado has been made about the ‘like’ button in the ‘attention economy’, but it’s time to start talking about the ‘share’ button in the ‘misinformation economy’.

“Much ado has been made about the ‘like’ button in the ‘attention economy’, but it’s time to start talking about the ‘share’ button in the ‘misinformation economy’.”

We have to fundamentally change the way all platforms work and start holding them to account in their dual role as neutral content carriers and media providers. As a long-term battle on this issue plays out in the policy world, we need to start working on misinformation resilience in the real world.

Trusted local and global institutions working in health and community participation need to be given unrestricted support across platforms to distribute expert information without users being tracked. There needs to be ‘no-strings-attached’ support for the trusted independent media institutions that many governments and digital platforms have destroyed, allowing them to do their job of gathering evidence and providing well-researched stories. Independent fact-checking efforts should be strengthened and made widely accessible (Agence France-Presse (AFP) Fact Check’s daily resource ‘Busting coronavirus myths’ is an education in itself). Citizens themselves need to become more resilient, and independent non-profits need to be given the resources to ensure users are digitally literate and can understand, verify and make their own choices about what they are seeing.

When it comes to WhatsApp, they need to invest in their team. They need to own their widespread impact on the world and match it with a diverse, cross-disciplinary team, one with real-world global experience and a range of ethical, social, cultural and political expertise, working in collaboration with engineers and designers rather than just on policy and PR. They need to invest in the countries where they have a negative impact but no offices, collaborating with and acting on the advice of local groups who know and understand their own environment and the challenges they face.

That sunny day two years ago, as I stood on the ‘free street’ at Facebook’s headquarters, WhatsApp was already a problem waiting to happen. They chose to do the minimum possible. Now it is time to turn public scrutiny towards WhatsApp, along with all the other platforms. And it is time for Facebook to own the responsibility that comes with ‘keeping it small’ for 2 billion users across the world.

Stephanie Hankey is the Executive Director of Tactical Tech, an international NGO working in public education on the impact of technology on society.
