The lies, tricks, and chaos of social media — the role for universities
Viral half-truths are part of the fabric of today’s internet. People share fake news even when they can tell it’s not true. And the anger misinformation inspires has been turned into a dangerous commodity. What exactly is going on — and what is the role of the university in combating it?
By Paul Newton, Head of Digital and Media at Keele University. Follow me on Twitter @pnewton84.
Satire and fake news
In 2019, a new university made a splash on the higher education Twitter scene. Founded in 1969, it boasts of being a Top Ten university (when ordered alphabetically) and is led by its Vice-Chancellor, Vince Chancelier. This university is, of course, ‘The University of Bantshire’, a parody account poking fun at the HE scene.
Satire and parody are nothing new. WW1 saw the publication of The Wipers Times magazine, and Private Eye magazine has been in circulation for almost 60 years. And today, thousands of parody Twitter and Facebook accounts exist, lampooning governments, celebrities, and all manner of public figures.
Parodies of higher education are popular, both online (such as the 2011 Occupy MLA movement) and offline (such as Julie Schumacher’s book ‘Dear Committee Members’). There is a rich tradition of parodying academic life, often shining a light on the big challenges facing the sector through humour.
“Like great journalism and great inspirational politics” the purpose of the best satire “is not to hurt but to cure”, says the BBC’s former Director-General Mark Thompson (2016). And according to Penn State University researchers, satire performs a vital function in democratic society by using humour to broach taboo subjects, especially in times of crisis.
The 2019 UK General Election
“The world of digital advertising has changed very fast since 2016. This is partly why so many journalists wrongly looked at things like Corbyn’s Facebook stats and thought Labour was doing better than us [The Conservative Party] — the ecosystem evolves rapidly while political journalists are still behind the 2016 tech”, wrote Dominic Cummings in his almost 3,000-word-long job advert at the start of 2020.
Ahh yes, 2016: the year before ‘Fake News’ was named Collins’ Word of the Year, defined as ‘false, often sensational, information disseminated under the guise of news reporting’.
Cummings is right. The game has changed — for better or worse. The 2019 UK general election saw disinformation and false polling reports shared widely online by all political parties, through parody websites, fake newspaper mock-ups, and doctored videos. A dangerous mix — it can be difficult to distinguish where parody and satire end, and fake news begins. We already know that too many people think satirical news is real. The problem is so prevalent that researchers from George Washington University are developing AI that distinguishes between satire and fake news.
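To give a flavour of how such a classifier might work (this is a toy sketch, not the George Washington University researchers’ actual method, and the training snippets and labels below are invented for illustration), one simple approach is to count word frequencies per class and score new text with naive Bayes:

```python
from collections import Counter
import math

# Invented toy training data: real systems train on large labelled
# corpora and use far richer features than raw word counts.
TRAIN = [
    ("satire", "local man heroically ignores all news for entire week"),
    ("satire", "area university declares itself top ten alphabetically"),
    ("fake",   "secret report proves rival party rigged the polls"),
    ("fake",   "leaked memo shows shocking cover up by officials"),
]

def train(rows):
    """Count word frequencies for each class label."""
    counts = {"satire": Counter(), "fake": Counter()}
    for label, text in rows:
        counts[label].update(text.split())
    return counts

def classify(counts, text):
    """Pick the class with the highest smoothed log-probability."""
    vocab = len({w for c in counts.values() for w in c})
    scores = {}
    for label, words in counts.items():
        total = sum(words.values())
        # Add-one smoothing so unseen words don't zero out a class
        scores[label] = sum(
            math.log((words[w] + 1) / (total + vocab)) for w in text.split()
        )
    return max(scores, key=scores.get)

model = train(TRAIN)
print(classify(model, "leaked report shows officials rigged polls"))  # fake
```

With only four training snippets this is obviously fragile, but it illustrates the core idea: satire and disinformation tend to use measurably different language, and a statistical model can learn the difference.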
But the parody posts during the election weren’t just being shared from parody or fake accounts. Official political party accounts also shared posts and videos that were almost a parody of themselves — such as the shitpostings* from the Conservatives, or the ‘Love Actually’ spoof video featuring Boris Johnson. Social media users either loved these posts, or hated them — but either way, they shared them widely.
*Shitposting is, to clarify, when someone posts something typically nonsensical, surreal, and ironic online — sometimes in order to bait people into a reaction.
But these tactics veered close to being fake news. One of the most surprising and deceitful events from the general election was when the Conservative Press Office Twitter account changed its name to ‘factcheckUK’ during a televised debate. The move was condemned as inappropriate and misleading by many news outlets and organisations, including the real fact checking organisation Full Fact.
Then there was also the widely reported “punch” of Matt Hancock’s advisor outside Leeds General Infirmary, which turned out to be false. In response, the BBC is reportedly considering restricting journalists’ use of Twitter to break stories or give instant analysis.
Social media networks are trying to implement ways to fight fake news on their platforms. Twitter banned all political adverts in October 2019, and Facebook are flagging false claims underneath offending posts. However, the steps taken by social media platforms to limit the ways their adverts can be used to influence politics are still fairly inadequate. For example, an advert during the election designed to split the anti-Conservative vote in key battlegrounds was run through an organisation called ‘3rd Party Limited’ (Private Eye magazine, 2019).
All of the above led The Coalition for Reform in Political Advertising to say that at least 31 General Election campaigns from across the party spectrum were indecent, dishonest or untruthful (BBC, 2019). There is too much to cover here, so this blog post is a great roundup of what went down, as is this in-depth Twitter thread by @rowlsmanthorpe.
“The unintended consequence of a network when it grows to a billion or two billion people. It literally changes your relationship with society, with each other. It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.” Sean Parker, founding president of Facebook.
So, what can universities do?
Don’t get tricked into sharing fake news
It only takes the click of a button to share something online, and people rarely stop to fully scrutinise the content they share, particularly if they agree with it.
Think SHEEP before you share!
It’s easy to get tricked into sharing, as with this fake drink-drive poster attributed to An Garda Síochána.
Be aware of bots
Bots are everywhere. State-owned bot armies exist. Bots are used during elections. And now fake Facebook accounts are using fake faces. But… some people pretend to be bots when they are in fact real humans, again to cause confusion and, well, because ‘it’s funny’.
I’ve even made a bot, and blogged about it. It’s that easy!
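Just how easy? Here is a minimal sketch of a template bot in Python. The phrase lists and the `post` stub are illustrative assumptions, not the bot described above; a real bot would wire `post` up to a platform API (for example via a library like tweepy, with account credentials):

```python
import random

# Canned fragments the bot recombines into posts. A more elaborate
# bot might use Markov chains or a language model instead.
OPENERS = ["Breaking:", "Hot take:", "Did you know?"]
TOPICS = ["university league tables", "campus squirrels", "library fines"]

def compose_post() -> str:
    """Compose a short post from randomly chosen canned fragments."""
    return f"{random.choice(OPENERS)} {random.choice(TOPICS)} #HigherEd"

def post(text: str) -> None:
    # A real bot would call a platform API here, on a schedule.
    # We just print for the demo.
    print(text)

post(compose_post())
```

A few dozen lines plus a free API key is genuinely all it takes, which is exactly why bot accounts are so numerous.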
Be critical. Pay attention to quality and timeliness. And check the sources yourself.
Also be aware of bots when reporting on your university’s social media stats. We’ve seen posts boosted to an international audience result in tens of thousands of likes, with very little other engagement (one or two comments or shares). It’s possible that your post’s huge reach is actually just being read by a lot of fake bot accounts.
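One rough way to sanity-check your stats is to compare likes against deeper engagement. This is a hypothetical heuristic, and the 2% threshold below is an illustrative assumption rather than an official benchmark:

```python
def looks_bot_inflated(likes: int, comments: int, shares: int,
                       min_ratio: float = 0.02) -> bool:
    """Flag posts with huge like counts but almost no other engagement.

    Genuinely viral posts usually attract comments and shares alongside
    likes; tens of thousands of likes with one or two comments is a
    warning sign that fake accounts may be driving the numbers.
    """
    if likes < 10_000:  # small posts are too noisy to judge this way
        return False
    deeper_engagement = comments + shares
    return (deeper_engagement / likes) < min_ratio

# A boosted post with 50,000 likes but only 2 comments and 1 share:
print(looks_bot_inflated(50_000, 2, 1))       # True: suspiciously shallow
print(looks_bot_inflated(50_000, 900, 600))   # False: healthy mix
```

No single ratio proves anything, but a flag like this tells you which posts deserve a closer look before they go into a report.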
Continue to use humour to engage
Whilst all of this is terribly serious, universities shouldn’t be put off social media and should continue to use humour, maybe even the odd meme or parody, to engage. Humour in higher education marketing and communications can be a powerful thing. It demonstrates confidence, and it really resonates with an educated young audience.
As the social web becomes a noisier place, it is becoming more of a challenge for universities to connect with their audiences, find new connections, and build unique and distinguishable identities (read more in Keele Communications’ interview with insidehighered.com).
And with the current atmosphere of mistrust, it’s still absolutely essential that university marketing campaigns and communications are trustworthy. I personally think it doesn’t help that the OfS refers to university marketing campaigns as a “necessary evil”, but they are right when they say that it is essential that universities do not mislead or over-promise, and instead provide high quality information to prospective students. Be less Bantshire.
We must remember that university websites have one of the most trusted domains (.ac.uk), and we should use these domains wisely to share facts. And when it comes to running Facebook adverts, it’s advisable that the person setting up the advert is verified on Facebook Ad Manager. This will give you a wider range of adverts you can run — although you will need to send Facebook a copy of your passport or driving licence.
A lie spreads faster than the truth on social media. Why? Perhaps because “eye-popping headlines in our social media feeds make it easier for us to share content than evaluate or even read it. This creates a viral storm of sound bites without substance” (Harvard.edu, 2020). We also “evaluate stories not simply on plausibility, but on a complex mixture of past experience, knowledge of context, authority of the source, and our own beliefs” (THE, 2020).
There is no way to eliminate fake news, but we can diminish its influence. Good journalism is needed more than ever to counter rumours and fake news. 14% of the public trust politicians to tell the truth, but 86% of people trust professors and scientists to tell the truth, according to IPSOS MORI’s Veracity Index 2019. Professors have seen public trust in them improve by 16 percentage points since 1993.
Universities see themselves as repositories of trust. We must therefore combat fake news with expert comment. A good way to do this is to write for The Conversation, a non-profit news site which many universities partner with, and which syndicates articles to all major outlets on a Creative Commons model. Interestingly, The Conversation already has 225 articles about fake news by academics, all free to read.
But, of course, time is tight for many academics. That is where universities’ communications teams come in — they can support in sharing expert, honest and accurate commentary across digital, broadcast, and print media. This might range from a simple Twitter thread or animated video commenting on the day’s breaking news, to a short video filmed on an iPhone and shared on the University’s verified Twitter account with thousands of people (and future students). People trust academics, so make your voice heard.
Don’t feed the trolls
Social media is an ideal place to promote your research. But as a researcher, you should expect your work and comments to be scrutinised by the public, policy makers, and campaigners.
Some researchers working on high-profile subjects that attract controversy have also found themselves targeted with online harassment, although the Science Media Centre advise that doing media work does not by itself increase the chance of a researcher being targeted.
There are four types of trolling:
- Astroturfing — the attempt to create an impression of widespread grassroots support for a policy, individual, or product, where little such support exists. In terms of trolling, it might mean coming under fire from the combined force of an entire country, or fake bot armies. Instead of taking offence, take heart: you are poking at something that someone really doesn’t want you poking at!
- Sea lioning — ‘killing with kindness’ and manufacturing ignorance by persistently asking questions, then turning on the victim in an instant — and then taking on the wronged victim role. What to say to a sea lion: “Here is a peer-reviewed, academically rigorous link explaining all the information you need”.
- Concerned trolling — They hide criticism behind a ‘genuine concern’, the classic wolf in sheep’s clothing.
- Incitement trolling — encouraging a dogpile on the victim, quoting people with whom they disagree, or sharing a screenshot with the words “share widely”. This lets loose a mob on the victim. Try to ride out the storm — ignore, block, or report.
If you experience trolling, the best advice is to step back and assess the situation. Are the views representative of the wider public, and do they have significant influence? Your university’s communications team will be on hand to help. But it’s important that you don’t allow yourself to be silenced. Be proud of the research you do, and be honest and transparent, and think about what you want the public to hear.
More advice on handling Trolls is available from the Science Media Centre.
Teach students the truth
There are lots of resources readily available to share with students, to give them the critical thinking and evaluation skills to spot fake news. University professional staff, too, would benefit, to stop the university system becoming a victim of fake news.
One of the most engaging resources I found whilst writing this blog was an online browser game by the University of Cambridge called Bad News, in which people play the role of propaganda producers to help them identify real-world disinformation. Players stoke anger and fear by manipulating news and social media within the simulation: deploying Twitter bots, photoshopping evidence, and inciting conspiracy theories to attract followers — all while maintaining a ‘credibility score’ for persuasiveness (cam.ac.uk). It’s super engaging.
A final word from Dom
Back to Dominic Cummings’ wide-ranging blog post from early 2020. Dominic refers to Dr. Robert Cialdini’s six ‘Principles of Persuasion’. The third principle is the idea that people follow the lead of credible, knowledgeable experts.
Be that expert.
Share your expert opinion on social media, and The Conversation, and fight fake news!