Presidential Techlection

NOVEMBER 13TH, 2016 — POST 307

Daniel Holliday

I was a little out of touch when I wrote earlier in the week about what I saw as a coming tech revolt. I wasn’t out of touch for overstating the point with dramatic tenor, but rather for understating it. Retreating from the varied sources of online information, I had missed Max Read’s piece for New York Magazine, in which one of the more influential arguments of this week was condensed. Read’s thesis should be clear from the headline: “Donald Trump Won Because of Facebook”. So as well as lamenting the lack of answers tech can provide in the wake of the results, it seems I would have been justified in saying the whole thing actually was tech’s fault. As Read writes:

“Endless reports of corruption, venality, misogyny, and incompetence merely settle in a Facebook feed next to a hundred other articles from pro-Trump sources (if they settle into a Trump supporter’s feed at all)”

Taking up Read’s piece two days later, Cliff Kuang of Fast Company elaborated that Facebook’s unavoidable influence on the election was twofold: the well-reported surfacing of fake news and the in-built echo chamber of the mollycoddling algorithm. The former is the point under most contention. Facebook CEO Mark Zuckerberg says it’s “crazy” to argue that Facebook had any part to play in the surfacing of fake news, even as Breitbart head and Trump campaign CEO Stephen Bannon admitted to Bloomberg that “Facebook is what propelled Breitbart to a massive audience. We know its power.” However, Kuang focusses most on the second point: that Facebook is specifically designed to allow bad information to percolate, gain support, and, most potently, keep people within their own bubbles. Kuang writes:

“A world of instantaneous, dead-simple interactions is also a world devoid of the opportunity to challenge what lies behind them.”

Bemoaning software and service design that is responsible for the loss of cross-partisan discourse cuts both ways, however. Those of us who watched the world crumble through our Twitter feeds on Tuesday night are arguably victims, too, of our own impenetrable bubbles enabled by “dead-simple interactions”. The sheer disbelief a great swath of the Left experienced was born from a genuine belief that Trump simply didn’t have the support. Those who were able to puncture our bubbles were outliers, crazed individuals responsible for things like forcing Leslie Jones off Twitter. And we knew those outliers wouldn’t be enough. However, behind them was a once-silent “majority” no longer silent, a group we were still unable to hear and a group that ultimately swung the 2016 election in favour of Donald Trump. (We can talk about the whole popular vote vs. electoral college thing, but the point stands: especially in states like Wisconsin, Michigan, and Pennsylvania, we just weren’t listening.)

That we reside in our own bubbles is only the beginning of the difficulty of unpacking tech post-election (and I’m still confident this was Big Tech’s first trial). The fact that most have begun to tepidly reenter tech’s streams with little more than a cursory reflection on these products’ value indicates a retreat back into the arms of the scolding parent. The pain that is now felt will be dulled through more tweets, more journalism, more exploitation of the very channels whose exploitation got us here. Because what else is there to do?

As Kuang outlines, “Modern user experience is a black box that has made it easier and easier to consume things, and harder and harder to remake them.” The platforms most in need of overhauling — namely Facebook and Twitter — have proved so powerful as to win (or lose) an election. It’s no wonder we might feel a little powerless by comparison if we were to attempt to wrangle tech back to the period of tinkering in the 1970s that Kuang refers to, one where computer use was synonymous with computer maintenance. I can’t change Facebook’s algorithm the way I can install a custom Linux fork on a computer (I can’t, but that’s at least something that is possible). I can’t, as is one of Kuang’s suggestions, implement something like a “truth verified” indicator for news stories on Facebook, something that would make stories that are verifiably true look different to those that aren’t. These are decisions only Facebook can make for itself, and it is unlikely to, given both its dismissal of implied guilt and its ongoing denial that it is functionally a media company and as such has a different set of responsibilities than a tech company.

Writing for Real Life earlier this year, Jesse Barron hit upon something that seems to encapsulate the bind we’re in. Whilst Barron was speaking specifically about the “post-dignity design” philosophy that Silicon Valley embodies (treating us like children), he thinks on what an act of defiance in the face of Big Tech might look like. Barron writes:

We are put in a position where we either embody the forces of repression or we enjoy a Silicon Valley product. What a convenient little elision for the Valley, the seat of real power. They’re not the repressive force; opposing them is. All they want is to let us be as free as when we were kids.

We seem unable to find a gap in which to set the wedge of criticism when we wish to reevaluate the value of technology. We’re either outside, talking in the language of a tin-foil-hat-wearing conspiracy theorist, or we’re inside, unable to see the forest for the trees.

If you enjoyed this, please take the time to recommend, respond, and share this piece wherever you think people will enjoy it. All of these actions not only help this piece to be read but also let me know what kinds of things to focus on in my daily writing.

Thanks, I really appreciate it.