Technology as Theory
Or, why Science Isn’t As Fucking Awesome as we think it is
The most important paper I’ve read in my undergraduate career was Donna Haraway’s “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” This paper has become an important part of my intellectual work for a number of reasons: it provides a clean solution to “what we might mean by the curious and inescapable term ‘objectivity,’” while also acknowledging and defusing the dangerous, combative rhetoric so often found between science-lovers and science-haters. Its style is easy and fun, in contrast to the boring stuff I’m used to encountering. More symbolically, however, it hit me at a time when I was just coming to terms with the fact that there might be more than one kind or source of objective truth — something I hadn’t allowed myself to fully comprehend prior to encountering that piece and the related pieces I would soon encounter.
I was thinking about this paper again when I ran across a tweet from venture capitalist and tech-bro hero Paul Graham:
It’s very tempting to want to get out of this argument by clinging to pragmatism. This is a view I’m sympathetic to, and, even after many years of investigation, still sometimes cling to. The world just is, isn’t it? Spending years working through a project of ontology — explaining how something (usually the world) comes into being — seems like a boring and petty waste of time when I’m pretty certain that, at the very least, cogito ergo sum: I think, therefore I am. And, to be fair, many of the claims of philosophy were constructed by bored white guys so concerned with being right that they clung to esoteric, bloated, and abstract language in a way that is interesting to philosophers but far too obtuse for anyone else. So it’s easy to reject it out-of-hand as too esoteric or too unpragmatic. This is a simple-minded move, however, because playing the pragmatic card immediately forces us to consider the world as not nearly as complex, interesting, or full of deviant possibilities as it could possibly be.
That’s dangerous to us as technologists — we are constantly in a battle to reimagine the world and to solve all of its complex problems in interesting, beautiful, and wonderful ways. Rejecting this kind of philosophical work as unpragmatic — though it may be entirely too dense and esoteric — is immediately contrary to our mission: making the world a smarter, better place. To do that kind of work we must begin to acknowledge and appreciate the complexities of the world: if we reduce our worldview to a single, static lens, we miss out on the wonderful opportunities afforded by accepting the views of many communities of knowledge.
More generally, this kind of thinking goes against and misrepresents our job as technologists — or, as some would call us, applied scientists. There’s a lot of debate over what a technologist even is — but if we are to hold to the “applied science” definition, it becomes incredibly crucial that we fully understand and appreciate science itself. To be fair to science, science itself has never claimed absolute truth. Scientists work on proving or disproving theory in a repeatable and objective way. The idea is that if you repeat your experiments enough, and if you come to the same conclusion every time, maybe then you’ve got your hands on some truth. The problem with this — and where we must turn to philosophy — is that simply doing something a lot of times in a lot of places doesn’t mean you’ve got yourself a truth, and science itself cannot bridge that gap. There’s no link that innately makes science the One True Absolute Way of Knowing.
Science itself, of course, is very new. Some critics — specifically, feminist and Marxist critics — want to say that science is actually a direct result of a turn to patriarchal capitalism that has violently and through force destroyed other ways of knowing. You see, in order to do science, you must assume that the world is simply all there is, that it is objective, that it can be manipulated by mankind in fruitful ways, and that it is atomic and predictable. These are cozy assumptions that make our work as technologists a ton easier. A lot of post-Marxist thought is dedicated to disputing these claims by investigating the ways in which capitalism forces us to reduce the world to atomic, repeatable, knowable bits, stripping it of complexity. Though I sympathize, I have to admit I find these claims a bit far-fetched. That being said, however, they do point to something important: science itself is constructed, and it is merely a method that attempts to point at objective truth. Whether it gets there or not is not science’s concern.
Thus we must treat every piece of technology we create as a theory — as but one way of understanding and interacting with the world. It is not the only way. Calling anything the best anything requires that we rely on a quite rocky and disputable definition of what we can consider best. We must reposition our work as technologists as the work of theorists at play. The good news is that — far from esoteric — this becomes a rather productive approach to truth, too. It lets us envision new ways of understanding and interacting with the world, and so of exploring and building it in different and intriguing ways. These possibilities are not merely tangential or accidental. Simply by changing our social location or embracing a new community of knowledge, we can begin to learn how to create interesting technologies that serve different purposes. I think those possibilities are incredibly awesome, and not something we think about often enough in our consumer-obsessed tech bubble.
But this process begins with rejecting the notion that truth is absolute, that it is objective, and that our approach to truth is the only approach to truth. There are many ways we as technologists can embark on a project like this: reading books, going back to school, learning languages, or moving to a foreign country (if you want to join me in Mexico City, holler at me here).
Ironically, of course, this all means that Paul Graham may have been accidentally right after all. We cannot accept the orthodoxy given to us as technologists. We whisper to ourselves that we must learn to think different in order to do our work, yet we cling to traditional ideas of truth and objectivity. Isn’t it time we started learning what “think different” really means?