Hey, Let’s Kill Facebook!
We always knew cigarettes were bad for us, and we kept smoking anyway. But then we found out tobacco companies were intentionally engineering their products to be more addictive…
“Facebook is the Big Tobacco of emotions.” That’s a thing I just kinda blurted out on a podcast recently because I believe it and it fit with what we were talking about and, most importantly, it sounded cool as shit.
So, imagine my disappointment when I Googled it later and found that I wasn’t the first person to put Facebook’s addiction-based business model in those same terms. If nothing else, I was beat to that point by Salesforce CEO Marc Benioff, who said this while talking to CNBC at the World Economic Forum in Davos this past January.
“I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible.”
“But Adam, that quote doesn’t even mention cigarettes,” I imagine you just declared. Fine:
“Here’s a product — cigarettes — they’re addictive, they’re not good for you, maybe there’s all kinds of different forces trying to get you to do certain things. There’s a lot of parallels.”
If you watch the CNBC interview in its entirety (or just read that quote again a time or two), it’s clear that Benioff is referring to Facebook and its influence, accidental or otherwise, over the 2016 election. He even suggests a fix for the problem … government regulations.
I know that’s a word that makes a lot of people shiver whenever it’s uttered. I’m also pretty sure, as a card-carrying libtard (to some extent), I’m supposed to panic at the thought of Trump being the one to finally crack down on the internet.
But nah. Fuck it, I’m fine with that. You should be too, no matter what side you’re on. At least as it pertains to Facebook. Regulations, censorship, nationalization … whatever it takes. Kill that monster while we still have a chance. My suggestion would be to clear out all the employees, burn all its infrastructure and buildings to the ground, and then study the ruins for the next few decades like Nazi goddamn Germany, just so we know how best to make sure it never happens again.
I can negotiate down to regulations though, no sweat.
Think about why we impose regulations. It’s not because they’re cool. No, dumping oil in water is cool. You can set that shit on fire and everything. The problem is it kills people and animals and hurts the economy so … regulations. You can’t dump oil in the ocean. It’s for safety reasons. Even if it’s sometimes a lie, protection is almost always the official company line for why regulations are necessary.
We need to be kept safe from Facebook. I honestly believe that.
I say this for a few reasons, but let’s start with the Big Tobacco comparisons. We always knew cigarettes were bad for us, but we kept smoking anyway. Our bodies, our choice, and all that. But then, at one point, we found out tobacco companies were intentionally engineering their products to be even more addictive.
Even worse, when we called them out on the dangers, the heads of all the various tobacco companies went in front of God and America and, most importantly, a television audience, to claim they had no knowledge at all of smoking being linked to cancer.
With that, we decided that if tobacco companies weren’t going to take any steps to lessen their product’s destructive tendencies, we’d take steps to do it ourselves. And we did. And it kinda worked. People still smoke, obviously, but not nearly as much as in previous decades. Around 50 years ago, 42 percent of the adult population in the United States were smokers. That’s dropped to around 15 percent now.
And guess what? Tobacco company profits are still fine!
We can do the same thing with Facebook, and it wouldn’t be unwarranted or especially harsh, either. Facebook has done every single thing Big Tobacco did, with the possible exception of giving people cancer, of course. That’s not to say Facebook doesn’t sometimes lead to outright deaths, but it’s at least not a known carcinogen. It does have that working in its favor.
Beyond that, Facebook is Big Tobacco. When Marc Benioff said the tech industry is engineering their products to be more addictive, he wasn’t doling out conspiracy theory talking points. Facebook was designed to be addictive, and the company has a long and sordid history of running experiments on unwitting human subjects in the name of finding newer and better ways to exploit that addiction.
Again, that’s not conspiracy theory talk. One of their most infamous projects was 2014’s “social contagion” experiment. That was a joint effort between Facebook and a few university scientists meant to determine whether they could make a user happy or sad just by manipulating what that user sees in their timeline. And they totally can!
It’s thought that the inspiration for this experiment came from another in 2012 that impacted 151 MILLION users.
Does that seem ethical to you? Sure, there’s some degree of implied consent in that you agree to Facebook’s Patriot Act-length user agreement when you use the site. There’s also some degree of implied consent to get cancer in the future when you smoke, but we still put warnings on cigarette packs to remind people of the risks they’re taking by using that product.
If Facebook is going to actively manipulate the emotions of its users, what’s the harm in making them put a banner on their site that explicitly tells people this? Make it ugly, make it obtrusive, and make it permanent on every page. Like a picture of a diseased lung on a pack of British cigarettes. If someone makes an app that removes it, throw that motherfucker in prison. Will fewer people use Facebook as a result? Yeah, they sure will, and that’s the entire point.
Also, let’s compare Facebook’s response when “fake news” became a headline to the tobacco industry’s response to links between smoking and cancer. It wasn’t all that different. They just kinda threw their hands up and said, “hey, we’re not a news site, we can’t be expected to police these things.”
Say no more, Facebook. We can take it from here. So that banner I mentioned that should warn users that Facebook might be manipulating them and is definitely watching them? Go ahead and throw some verbiage in there about how everything presented on the site is for entertainment purposes only and should in no way be construed as actual news.
If Facebook doesn’t want to be burdened with the rigors of maintaining a sense of credibility and integrity in what they share with their audience, then strip them of that credibility. Remember those early days when someone would say “I read it on the internet” and the entire room would roll their eyes because we had no way of knowing if what we read on the internet was real without doing layers and layers of extra research?
“I read it on Facebook” needs to be the 2018 version of that. We need to look at people who turn to Facebook for news the same way we now look at people who smoke for relaxation. It’s ultimately your choice, just know that everyone around you thinks it’s a shitty choice and you should change your habits for the betterment of society sooner rather than later.
From there, we tax some shit. We did that with cigarettes and pumped that money into education programs to warn people about the dangers of cigarettes. What’s to stop us from taxing Facebook transactions and using the money to give people a heads up about the evils of fake news?
Or maybe you think all this talk about Facebook’s influence on the 2016 election is nothing more than noise meant to discredit your guy and his win. Cool, but are you sure of that or are you just saying it because the outcome allegedly benefited your side? Because this really isn’t a partisan issue. A tool capable of unfairly swaying elections at the whim of anyone with enough money to pump into it is bad for all sides. Besides, that’s what we have electronic voting machines for, am I right?!?!?!?
Seriously though, if you’re a Trump supporter, you more than anyone should be concerned about Facebook’s ability to generate votes for one side or the other. If you don’t believe that now, you probably will when Trump is literally running against Facebook in 2020.
Here’s the thing, when Trump took office and we all wanted him to divest from his business interests and he was like, “sure, I’ll sorta do that,” and we just shrugged our shoulders and moved on, we set a precedent. The left had bigger fish to fry and the right didn’t care because they won.
Fine, so what happens when Mark Zuckerberg runs for president? Don’t laugh. People laughed when Trump ran for president and now here we are. He says he’s not running and the media assures us that even bringing it up is absurd, but, and I’m speaking directly to you, Trump supporters … since when do you believe the mainstream media?
What we know for sure is that, in 2017, Zuckerberg took a lot of trips to a lot of states in the middle of the country, and in a way that perfectly mirrors what a person does when they’re running for president. Whether anyone wants to admit it or not, Trump was probably the test case for Facebook’s ability to influence public opinion in an election. Now the guy controlling the machine that made that happen is hinting that he might take a run at it himself.
Remember, I’m speaking mostly from the left here when I say I find that prospect terrifying. No more or less terrifying than the idea of Trump winning again in 2020. I don’t fucking know Mark Zuckerberg well enough to trust that he won’t just turn that machine of his on us if he takes control someday. Did I mention that several of Facebook’s patents involve turning your camera on when you don’t know it’s on so they can scan your face and get a sense of what kind of mood you’re in? Same goes for measuring the pressure with which you type a text message before logging onto Facebook.
Trump’s business was real estate. Mark Zuckerberg’s business is surveillance. I don’t trust him to set his cache of data about me ablaze on the White House lawn on inauguration day any more than I trust that Trump no longer has a vested interest in the real estate market.
Also, and again, speaking to Trump supporters here, it’s worth noting that Trump’s immigration plans and policies are not a thing the tech industry supports. I don’t say that because tech bros are all liberal snowflakes or whatever. I say it because Trump’s earliest policy proposals that were posted to his website, before he was even elected, called out the tech industry for exploiting cheap labor from overseas in the name of profits and selling it to the public as diversity.
In other words, the guy who maybe helped your guy get elected now has a vested interest in making sure your guy doesn’t get elected again. So, how confident are you that we should stop worrying about looking into how outside influences impact elections in this country? If you’re a Trump supporter, are you sure those are dice you want to roll in 2020?
I’m sorry, but the Russia investigation is the only thing keeping Facebook’s penchant for influencing public opinion in the spotlight, and it needs to carry on to its conclusion. If it somehow means Trump has to leave office (which it almost certainly won’t), your side will survive with Mike Pence in charge for a while. Women and the LGBTQ community might not, but you’ll be fine.
Anyway, I’m really not trying to call anyone out or blame one side or the other. I’m just saying that when the villain lands a death ray in the town square and starts firing it at people, you don’t respond by arguing about how much the people it hit deserved it. You pool your resources and destroy that shit so it doesn’t hit you next, and then you go back to hating each other once the threat has passed.
If Facebook really is capable of swaying an election, that’s a threat to every side and we need to know about it and we need to take steps to fix it. In this world of divisions and conflicts, even if only briefly, it’s time all sides join hands and defeat our common enemy.
So, what do you say, America? Let’s kill Facebook!