Real Problems, Bad Data
This past summer SourceFed, a news channel that operates on YouTube, released what they thought was proof of Google using their powerful search engine to manipulate autocomplete results in favor of Hillary Clinton.
The video is professionally edited, with several apparently damning screenshots that claim to show that Google is censoring negative stories about Hillary while Bing and Yahoo show them. It got a lot of press and quickly went viral.
The problem is, the evidence is wrong.
See, SourceFed didn’t consult with anyone who worked in search before publishing their piece, nor did they reach out to Google for comment. They saw something they thought was important and ran with it. Over the next several days, several search experts who are far smarter than I weighed in, including Google’s own Matt Cutts.
SourceFed isn’t a news organization. While they try to investigate stories, tech isn’t their thing, and the reason the video went viral has more to do with the nature of the claim than with the evidence behind it.
The endless news cycle eventually pushed the story out of the trending topics, and this political idiocy we call an election marched on. Then Robert Epstein published a new “study” based on SourceFed’s video that tried to make the same argument.
Dr. Epstein is a research scientist, so his experiment was far more involved than SourceFed’s, but like them, he didn’t contact anyone who worked in the industry to better understand autocomplete (I confirmed this with him).
So his experiment had the same problem: The evidence is wrong.
I’m not going to spend my time debunking his conclusions. For that, you should read this amazing piece by iPullRank, who is an expert in the field. What I want to talk about is Dr. Epstein’s main concern, that a company like Google, Facebook, or Twitter, could potentially sway public opinion. Because he’s right.
I just feel “research” like his does more harm than good.
Billions Of Bubbles
One of the amazing things about the internet is that it makes it easier than ever to discover new things. Looking for a cool new soundtrack? Check out what’s trending in Korea. Want to understand the world around you better? Read a newspaper written on another continent in another language, and your browser will translate it for you.
But the internet also makes it easier than ever to avoid having our conceptions challenged. Even if you live in the middle of nowhere, online you can connect instantly with a community of thousands of people who think just like you do. You have somewhere to fit in that doesn’t force you to evaluate your beliefs. This is the “echo chamber” effect that we experience online, where everyone cares about what we care about, even if people you know in real life think it’s a minor issue.
The problem is that the echo chamber is lucrative. In his article, Dr. Epstein referred to confirmation bias, our tendency to select things that align with what we already believe to be true. This is why, if you talk with someone who is your political opposite, you discover that you likely don’t read the same websites, or even accept the same facts as true. The Wall Street Journal displayed this brilliantly by showing just how different the typical progressive and conservative news feeds are on Facebook.
If you want to look at something that’s having a noticeable impact on this election, looking at the news media is a great place to start. But that’s another topic for another time.
Google, Microsoft, Facebook, and other online services are in the business of making money off of you when you use them. They do this by getting you to keep coming back to their service because you find what you’re looking for. The easiest way to do this is to start personalizing the results you see so that they’re more likely to be relevant to you, even if those results are not what they’d otherwise show.
This means that the more you use the service, the more the results are tailored to your biases, your previous reading history, what the service thinks you’ll love. And they are very good at knowing what you’ll love; their entire business model is built around it. Eventually, your results won’t look like those of people who don’t agree with you, and you’ll be in what Eli Pariser calls the Filter Bubble. Your current reality will literally be shaped by your past preferences, making it harder for you to be exposed to new ideas.
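To make the feedback loop concrete, here is a deliberately simplified sketch of how click history can narrow what a ranker shows you. Everything here is invented for illustration: the article list, the viewpoint tags, and the scoring rule are my own toy assumptions, not any real service’s algorithm, which would weigh thousands of signals.

```python
from collections import Counter

# Hypothetical corpus: each article carries a viewpoint tag.
ARTICLES = [
    {"title": "Tax cuts will boost growth", "viewpoint": "conservative"},
    {"title": "Tax cuts widen inequality", "viewpoint": "progressive"},
    {"title": "New climate rules pass", "viewpoint": "progressive"},
    {"title": "Rules burden small business", "viewpoint": "conservative"},
]

def rank(articles, click_history):
    """Score each article by how often the user clicked its viewpoint."""
    prefs = Counter(a["viewpoint"] for a in click_history)
    return sorted(articles, key=lambda a: prefs[a["viewpoint"]], reverse=True)

# A user who has only ever clicked progressive stories...
history = [ARTICLES[1], ARTICLES[2], ARTICLES[2]]
feed = rank(ARTICLES, history)

# ...now sees progressive stories ranked first. Each new click
# reinforces the preference count, and the bubble tightens.
print([a["viewpoint"] for a in feed])
```

The point of the sketch is the loop, not the scoring: whatever you click rises, what rises gets clicked more, and the other side of an argument quietly drops off the first page.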
This is a serious problem. DuckDuckGo, an alternative search engine, is built on the idea of not giving you personalized results, in part to help you escape the filter bubble. Digital marketers, like myself, are also concerned about this bubble. Selfishly, it makes it harder for us to reach broad audiences, but we’re also more likely to recognize this bubble and be concerned about its possible effects.
Which brings us to the core question of Dr. Epstein’s multiple studies: Could Google (or another company) manipulate users, even in something like an election?
The Real Problem
The short answer is yes, theoretically. Facebook revealed in 2014 that they had performed an experiment to see if they could manipulate someone’s emotions by controlling the types of posts that appeared in their news feed.
They could, though the change was minimal. After testing their theory on more than half a million participants, they found that manipulating the types of posts someone saw in their news feed led that person to write, on average, one fewer post expressing the emotion opposite to the one emphasized in their feed.
But this less-than-impressive study doesn’t mean we should ignore the results. A recent Pew Research study found that 38% of American adults got their news online, second only to TV (56%) as the primary source of information. Data companies learn more about us every day, and they’re getting better at understanding not only our actions but our motivations as well.
Most importantly, they will only rarely tell us how they figure out what they do. Trying to figure out Google is big business, with companies on track to spend $65 billion on Search Engine Optimization (SEO) in 2016 alone.
There is a lack of transparency in big data, which is one of the reasons it’s so scary, even for those of us who are around it every day. We need a better understanding of how Facebook and Google work, and of how their actions influence us.
That’s what makes Dr. Epstein’s “research” so dangerous. It’s bad data, and it distracts from the real issues. Instead of discussing the very real problems of filter bubbles, personalized search, or potential manipulation, we get screenshots of Google Trends and low volume keywords that he feels are a smoking gun.
It distracts from the actual problems, it buries the knowledge we do have beneath conspiracy theories that look great for a single news cycle and then fade from existence.
Bad Data is Worse Than No Data
When Dr. Epstein published his latest study, I asked him on Twitter if he consulted with anyone in digital marketing or reputation management (two industries where you need to understand how Google works to continue making a living) before publishing his piece. His response was telling.
I’m not sure what you mean. Why would consult such people? I just do research. — Dr. Epstein
When other SEO experts chimed in, Dr. Epstein either ignored their criticism, insisted that they “read his piece” to see the new evidence he had found, or said that they could not be impartial because most of their revenue came as a result of Google.
This is true, and I’ll cover it more in a supplemental post, but the reality is that people who are working in jobs where their income depends on them understanding how Google works are exactly the people you should consult when you’re trying to report on any perceived biases in the system.
We spend hours every week poring over studies or conducting our own. We try to figure out how the search engine works and how people interact with it. Yes, this makes us money, but it also makes many of us the closest thing you’ll find to Google experts outside of the company.
One of the things the popularity of the SourceFed video and Dr. Epstein’s study reaffirmed for me is that my industry does a terrible job of explaining what we do and, more importantly, why we do it. Online search is complex, and it’s constantly evolving. Understanding how it impacts our lives is important, and it’s a conversation we need to have. But we need to ground that conversation in accurate data.
In marketing, having no data is preferable to having bad data, because bad data lets you act: you make decisions based on it, confident you’re solving the problem. Bad data distracts you from finding real solutions, because solutions built on bad data are unlikely to address root problems.
I don’t disagree with SourceFed or Dr. Epstein because I think what they’re implying is impossible. I disagree because I think it is possible, and it’s something we need to discuss. But when you reach conclusions based on incorrect or uninformed assumptions, they’re often the wrong ones.
I believe that privacy, security, and transparency are the biggest issues in the digital space and I’m interested in solutions, in data. Not exclusives.
Disclaimer: I work in the search marketing industry. This post reflects only my opinions and does not reflect the thoughts of my employer, other SEOs, or the industry at large.