Silicon Valley Has Failed Us

Today’s news about Google and Facebook is merely the neon sign pointing to their deeper, systemic failure.

Google’s top news result for a search about the 2016 election results leads to a fake news site with false numbers. Facebook had a fix for the fake news sites flooding people’s timelines, but shelved it for fear of a backlash like the one over Trending Topics, which was allegedly filtered with a “liberal bias.”

This is what happens when we take morality out of the equation. This is what happens when we let algorithms completely take over. We like to act as if technology and data exist in a vacuum, as if bias cannot exist because a computer is running the show. Unfortunately, what you see above is just the giant flashing neon sign in the night, telling us that Silicon Valley has failed society in a fundamental way.

These Google algorithms that are supposed to give us the truth? Yeah, they’re based partly on your search history and partly on page views, so people sharing those fake news sites are unduly influencing said algorithms. Facebook’s News Feed algorithm that is supposed to stick to friends and family? It is heavily influenced by the people you most recently interacted with, so if you spend a week talking only to a certain few people, your feed will start to shrink, and you’ll have less contact with certain friends and family. Over time, if you don’t act to reach out to people, your circle grows smaller, and the range of views you’re exposed to diminishes. This doesn’t even begin to account for all of the made-up nonsense that people credulously share because it sounds good, even though it probably came from a teenager in Macedonia.
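That shrinking-circle dynamic can be shown with a toy model. This is not Facebook’s actual ranking system — the decay rate, the boost values, and the feed size below are invented purely for illustration — but it captures the feedback loop: the feed surfaces whoever you interacted with recently, you interact with whoever the feed surfaces, and everyone else fades away.

```python
import random

def simulate_feed(num_friends=20, feed_size=5, rounds=30, seed=42):
    """Toy feedback-loop model (hypothetical parameters, not Facebook's
    algorithm): each round the feed shows the highest-scoring friends,
    the user interacts only with those, and interaction boosts their
    scores while everyone else's score decays."""
    rng = random.Random(seed)
    scores = {f: 1.0 for f in range(num_friends)}
    seen_per_round = []
    for _ in range(rounds):
        # The feed surfaces only the top-scoring few friends.
        feed = sorted(scores, key=scores.get, reverse=True)[:feed_size]
        seen_per_round.append(set(feed))
        for f in scores:
            scores[f] *= 0.9                     # everyone's score decays...
        for f in feed:
            scores[f] += rng.uniform(0.5, 1.0)   # ...but surfaced friends get boosted
    return seen_per_round

history = simulate_feed()
ever_seen = set().union(*history)
print(f"{len(ever_seen)} of 20 friends ever surfaced")  # → 5 of 20 friends ever surfaced
```

After the very first round, the same handful of friends is locked in: their scores keep getting refreshed while the other fifteen decay toward zero and never appear again. No one designed the lock-in; it falls out of recency weighting by itself.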

For a social network, this sounds decidedly antisocial.

Facebook and Google are the new gatekeepers. So much information flows through them. Google isn’t just a search engine; its name has become the verb for searching the Internet. That power requires more than a technocratic view. It requires a human touch, human judgment, humans looking at what the algorithms produce and steering them toward doing the right thing. In the end, such things are beyond political affiliation, because people of any political stripe are just as open to being bamboozled. So yes, algorithms should be deployed to filter out fictional “information.” Algorithms should be used to ensure that we are all communicating with each other, instead of being segregated into cones of silence that nothing from outside can penetrate.

So, when I hear that Facebook had a fix for this and didn’t implement it because someone might get upset and stop using Facebook — a social network that gives you unlimited storage space and lets you find people at zero cost — it infuriates me. It is amoral behavior. It is the opposite of “don’t be evil.” It has been clear for a very long time that people are quite naïve about the Internet (witness how many are taken in by chain emails or Nigerian princes), so to ignore the proliferation of fake news, to let fiction masquerade as truth because correcting it might hurt someone’s feelings, is irresponsible. When I see Google using an algorithm that takes a person’s own biases into account to determine their search results for news, that is irresponsible too. No, you cannot force people to see what they do not wish to see, but when you take the choice away, it’s nanny-statism in private-sector clothing. It is the exact opposite of the libertarian ethos that has permeated Silicon Valley, where “freedom” is a buzzword and “responsibility” is left behind.

I wasted a lot of time and energy this year trying to correct a torrent of biased and/or fake “news” in conversations with people, and not a single mind was changed. We’re already conditioned to believe whatever fits our worldview, and few of us bother to seek out information that contradicts it — and the tech bros of the Valley have made it so we are unlikely to stumble across anything that doesn’t already fit the algorithms trained on our Google searches and Facebook comments.

Congratulations, guys. I hope you enjoy the next few years as truth becomes a meaningless concept, all because you couldn’t be bothered to have ethics inform your decisions.