Fear, Elections, and the Filter Bubble
As this US presidential election officially gets underway, with all candidates nominated and anointed, I want to invite you to be very, very thoughtful and prudent and cautious about what you share on Facebook, Twitter, Google, and all your social media.
Let me tell you why.
I would gladly bet you any amount of money that, right now, there are thousands of computer algorithms happily munching their way, Pac-Man-like, through everything about the 2016 elections being posted on Facebook. Every like and reaction tracked, every comment dissected and analyzed by an eager little machine learning model.
These pieces of software care nothing for civility, accuracy, thoughtfulness, or any of the hallmarks of an informed electorate in a civil democracy. They care about what gets shared — what causes reactions — where people get the information they share — how this information spreads — who it eventually reaches — and how fast.
These robots and spiders and data miners will process more and more information on the 2016 election, and then they will be set on the next task: predicting what people will want to see… depending on what they have already seen, who they know, and what they shared.
So — whatever it is you believe now — whatever gets you sharing, liking, arguing — Facebook will feed you more of the same. And only more of the same. You will see less and less of that which upsets you, disturbs you, makes you sign off in frustration. You will see more and more of what validates you, makes you feel important, makes you feel wise and worldly, and keeps you swiping, liking, clicking.
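To make that feedback loop concrete, here is a deliberately tiny sketch in Python. This is not Facebook's code, and I'm not claiming to know their system; real feed ranking uses far more sophisticated machine learning models and signals. But the basic loop is the one described above: what you react to shapes your profile, your profile shapes what you see next, and what you see next invites more of the same reactions.

```python
# Toy sketch of engagement-driven feed ranking (hypothetical, not any real platform's code).
# The point: engagement feeds the profile, the profile feeds the ranking,
# and the ranking invites more of the same engagement.

from collections import Counter

def update_profile(profile: Counter, engaged_post_topics: list[str]) -> None:
    """Record the topics of posts the user liked, shared, or commented on."""
    profile.update(engaged_post_topics)

def rank_feed(profile: Counter, candidate_posts: list[dict]) -> list[dict]:
    """Order candidate posts by how closely they match past engagement."""
    def score(post: dict) -> int:
        # More overlap with previously engaged topics -> higher in the feed.
        return sum(profile[topic] for topic in post["topics"])
    return sorted(candidate_posts, key=score, reverse=True)

profile = Counter()
posts = [
    {"id": 1, "topics": ["candidate_a", "outrage"]},
    {"id": 2, "topics": ["candidate_b", "policy"]},
    {"id": 3, "topics": ["candidate_a", "memes"]},
]

# Session 1: the user engages with an angry post about candidate A...
update_profile(profile, ["candidate_a", "outrage"])

# ...so in session 2, similar posts float to the top, drawing more of the
# same engagement, which skews the profile further. Repeat indefinitely.
for post in rank_feed(profile, posts):
    print(post["id"], post["topics"])
```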
The business model of social media depends on convincing you that you are right about everything, and that everyone you know agrees with you.
Facebook (and Twitter, and Google) has been doing this for years. It’s not new. But it’s getting worse. Inform yourself about “filter bubbles”. These bubbles get stronger as machine learning algorithms get better. I’ve been using machine learning professionally for 4 years, and let me tell you: filtering algorithms are getting scary good.
Brexit was a warning: the West has transitioned to a post-information democracy, “post-truth politics”. Now, more than ever, we live in democracies filled with people who — thanks in large part to filter bubbles — think they know things, but really don’t.
And when you think you know something (but really don’t), and those not-facts that you cling to tell a scary story, and no one in your media environment corrects you — well, that’s not going to end well. As the UK just learned, a scared low-information voter is a very dangerous thing. He or she cannot be reached, cannot be engaged with, cannot be persuaded. Filter bubbles didn’t cause Brexit, but I’m convinced filter bubbles made it easier for Brexit to happen.
What does that have to do with the Presidential election?
We are about to find out what happens when the American voter, similarly frightened and locked behind his or her filter bubble, goes to the ballot box.
What will happen once these little robots finish their work, and once their masters at Facebook and Google start selling their findings to the DNC and the RNC and the cable news channels and marketers? If the current online climate is any indication, they will report, and the political parties will hear, that based on what we see on social media, the American people are at their most energized, their most engaged, their most participatory, when they are scared.
And just like Facebook and Google, the political parties will give us more of what benefits them. Already we see the frameworks being placed, the scaffolding of fear. “Only Trump can stop the Muslims!” “Only Hillary can stop Trump!” “We can’t trust him with the nuclear codes!” “We can’t trust her to keep us safe!”. If that gets you to vote, if that gets you to watch ads, and volunteer, and donate — then that’s the message, whole and entire.
Who, exactly, can possibly be reached by this browbeating? What room is left for reasoned debate?
Is that what American democracy will be reduced to? Two people out to make boogeymen of each other? To be “the only one who can” protect you, soothe you, validate you? I think it will be.
When the political discourse of an entire nation’s citizenry gets reduced to “Other scary! Other bad!” — because that’s what gets clicks, eyeballs, likes, ad money, and campaign contributions — democracy as we imagine it has ceased to exist. We become a Balkanized collection of tribes, wall-builders, name-callers, incapable of consensus because we are armed with different collections of “facts” and are constantly impugning each other’s motives. That’s not how you keep and build a country.
Democracies are built by an electorate that is motivated, persuaded, and informed.
Fear motivates, but it does not persuade.
Propaganda persuades, but it does not inform.
Facts inform, but they do not motivate.
I encourage you to take positive, purposeful action to break out of your filter bubble. Talk to people you don’t always reach. Listen to people you’ve grown used to ignoring. Inform yourself before informing others.
I encourage you to desist from inciting fear in others, no matter how dire the future looks in your filter-bubble crystal ball. Seek to motivate, persuade, and inform — all at once, or not at all.
And lastly, I encourage you to examine your reasons for engaging in social media politics. Do you really want to get people to vote for your candidate, or do you want to feel smug about your own choices?
Do you know — really know — what it would take to change hearts and minds? And do you know what it would take to change yours?