Facebook bias isn’t the problem. This is ⬇️

Michael Marinaccio
People Over Product
8 min read · May 21, 2016


Am I the only one tired of everyone losing their marbles over Facebook?

Following accusations that Facebook was censoring conservative news, Glenn Beck concluded that the only disturbing part of his private meeting with Facebook was the ingrained suspicion of his colleagues.

Suspicion of Silicon Valley is commonplace on the political right: a fear of liberal tech giants who wield immense intellectual power over the growing digital sphere. Exemplifying this fear, Robert Epstein wrote of his concern that Google could rig the 2016 election: “given Google’s strong ties to Democrats, there is reason to suspect that if Google or its employees intervene[d]… it [would] be to adjust the search algorithm to favor Hillary Clinton.” This idea is alarming and its gravity certainly captures our imagination. But controversies like these are insignificant compared to a threat already underway that we hardly notice.

“If we don’t understand the commercial, political, intellectual, and ethical motivations of the people writing our software, or the limitations inherent in automated data processing, we open ourselves to manipulation.” -Nicholas Carr

Build a church and they will come

There exists in society an invisible religion that surrounds our daily lives. Nobody wants to talk about it (so no, it’s not gluten-free, crossfit, or vegan).

It is a church dedicated to the worship of technology and the stripping away of the old. Ironically, the very people selling us the technology — the Steve Jobs or Larry Pages of the world — are also the ones crafting the philosophical and social standards (the hype) that demand those technologies.

Virtual reality, artificial intelligence, neural networks — these are all tech-evangelist prophecies to further promote and sell goods that consumers are told to believe they want. This is not some conspiracy — it is a reality we live in that has put marketing on steroids. The old “Want someone to buy your product? Convince them they need it” is now “Convince them they can’t possibly live without it.”

This is why the Facebook bias story is so comical to me. To suspect that Facebook or Google would seek to censor or prioritize certain lines of thinking misunderstands their core philosophy. In reality, they don’t have a stake in either side. Their philosophy does not aim to improve, change, or manipulate human thinking.

Their philosophy aims to replace human thinking — and charge you for it.

Technology’s philosophy of efficiency

Facebook and Google’s core philosophy was shaped 150 years ago. Nicholas Carr explains in The Shallows that if Silicon Valley had a founding father or philosopher, the honor would go to Frederick Winslow Taylor.

He was the young man who first brought a stopwatch into a steel plant. By minutely dissecting “every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions — an ‘algorithm,’ we might say today — for how each worker should work.” He created a perfect model for working.

A century later, “Taylor’s system of measurement and optimization is still very much with us; it remains one of the underpinnings of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual and social lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient, automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the ‘one best way’ — the perfect algorithm — to carry out the mental movements of what we’ve come to describe as knowledge work.”

“Google’s headquarters, in Mountain View, California — the Googleplex — is the Internet’s high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is ‘a company that’s founded around the science of measurement,’ and it is striving to ‘systematize everything’ it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day… and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.”

Our brains are not computers

Taylorism imbued a century of philosophical thought with the belief that the human mind worked like a machine and was to be tamed efficiently. Epstein explains that not only were we wrong then, but every new invention made us wrong again.

In the “1500s, automata powered by springs and gears had been devised, eventually inspiring leading thinkers such as René Descartes to assert that humans are complex machines… By the 1700s, discoveries about electricity and chemistry led to new theories of human intelligence — again, largely metaphorical in nature. In the mid-1800s, inspired by recent advances in communications, the German physicist Hermann von Helmholtz compared the brain to a telegraph.” Each brand new explanation of the brain only “reflected the most advanced thinking of the era that spawned it.” We genuinely seem to have forgotten it is a metaphor.

So too now, computers have become the metaphor for how we see ourselves and the mantra for big tech company sales pitches. But it’s just not true. Epstein argues that the differences are clear. Computers “operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.” Humans, on the other hand, do none of these things.

It’s one thing to use technology to help understand the mind. It is another to develop technology based on the assumption that the metaphor is science. The point when we begin using technology more like a map than a metaphor is when we find ourselves trapped, slaves to its creators.

David J. Walbert explains that “the internet was supposed to bring decentralization of power, but in fact it’s consolidated power in the hands of whatever company manages to build the first and/or best standard.” In his example, an automatic coffee maker is a wonderful invention if you are willing to forever give up your knowledge of how to grind coffee manually.

If our brains are nothing like computers, then the implications tear apart the very foundations of what we believe and why we need to buy and connect with the vast spectrum of technological things. If Taylorism is false and the brain is not a computer, the philosophy that Facebook and Google want you to believe falls apart. They can no longer ardently sell their products on dogma:

  1. If our brains are not computers, then our minds (our actions and decisions) cannot be replaced by computers or artificial intelligence.
  2. If our brains are not computers, then our ability to manage and multi-task hundreds of apps with thousands of data points becomes a tedium instead of an aspiration.
  3. If our brains are not computers, then attempting to connect, index, and automate them will have dire unintended consequences to society — the worst of which is damaging our minds.

Mental atrophy is the future


It tickles me whenever science fiction depicts human beings in 500 years with tiny bodies and giant brains. The reality is, as machines replace our learning, the opposite is happening. Inherent in any use of new technology is an abdication of certain prior functions. When you grip a hammer, you can no longer catch a ball. Or when you look through a telescope, you can no longer sense things nearby.

“Actually, it works the other way. The more accurate the machine gets, the lazier the questions become.” -Amit Singhal, Google

Carr points to an excellent example of this numbing in search engines:

“Google acknowledges that it has even seen a dumbing-down effect among the general public as it has made its search engine more responsive and solicitous… We might assume that as Google gets better at helping us refine our searching, we would learn from its example. We would become more sophisticated in formulating keywords and otherwise honing our online explorations. But according to the company’s top search engineer, Amit Singhal, the opposite is the case. In 2013, a reporter from the Observer newspaper in London interviewed Singhal about the many improvements that have been made to Google’s search engine over the years. ‘Presumably,’ the journalist remarked, ‘we have got more precise in our search terms the more we have used Google.’ Singhal sighed and, ‘somewhat wearily,’ corrected the reporter: ‘Actually, it works the other way. The more accurate the machine gets, the lazier the questions become.’”

In the same way, by abdicating our critical thinking to streams, filters, and algorithms that present us only with content we’ll “like,” we can make ourselves susceptible to confirmation bias and uninformed choices — to dependency. The Wall Street Journal demonstrated this dependence in their side-by-side, Blue Feed, Red Feed, depicting the stark differences in content based upon your political affiliation online.

Who needs Facebook to maliciously bias their trending topics or news feed when we are doing just fine by ourselves?

Facebook isn’t the problem. We are.

Our mindless adoption of the culture of technology has become, as Carr describes, similar to how “indoor plumbing became invisible, fading from our view as we adapt ourselves, happily, to its presence.”

The strategy is fanaticism and the agenda is dependency. The more concepts that Silicon Valley floats as absolute necessities to human existence, the more it can insert itself as a power broker between our lives and knowledge. Brilliant as always, The Onion spells it out clearly: “Facebook Clarifies Site Not Intended To Be Users’ Primary Information Source” — as if they’d have it any other way.

We need to critically evaluate how we use media and what types we indulge in. It is a false dilemma to pit the wholesale rejection of technology against the feverish embrace of all of it. This article is not meant to be a dig at technologists or technology, but at the dogmatic ideology and at those who blindly march to technology’s drumbeat.

Let me be clear: the ways in which technology has improved the material existence of humanity — our longevity, convenience, and standards of living — are real. But we shouldn’t mistake that material improvement for mental improvement. The threats to knowledge are just as real.

If you like what you read, be sure to ♥ it below. Stay in touch by subscribing to my newsletter or following me on Twitter.

Further reading:

Is Google Making Us Stupid? by Nicholas Carr

The Shallows: What the Internet Is Doing to Our Brains by Nicholas Carr

The Empty Brain by Robert Epstein

How Google Could Rig the 2016 Election by Robert Epstein
