My thesis, to put it bluntly, is that from late neolithic times in the Near East, right down to our own day, two technologies have recurrently existed side by side: one authoritarian, the other democratic, the first system-centered, immensely powerful, but inherently unstable, the other man-centered, relatively weak, but resourceful and durable. If I am right, we are now rapidly approaching a point at which, unless we radically alter our present course, our surviving democratic technics will be completely suppressed or supplanted, so that every residual autonomy will be wiped out, or will be permitted only as a playful device of government, like national balloting for already chosen leaders in totalitarian countries. — Lewis Mumford
Urbit is difficult to understand because it combines multiple things that are normally separate from each other. It is a ‘computer’ or ‘operating system’ in certain technical senses of those terms, but it also includes its own programming languages and identity system. At a practical level, it’s perhaps easiest to understand it as a platform that provides new primitives for building peer-to-peer applications. Urbit also represents a particular community of self-selected weirdos who — for whatever reason — have elected to learn enough about its technology to join and participate in the network.
But Urbit also represents a novel critique of our current Internet stack, one that sensitizes us to the largely imperceptible ways that seemingly mundane technology design choices can nudge us towards centralization and, in the limit, authoritarianism. These ideas are what initially drew me to Urbit back in October 2021, and they are what I believe will have lasting value even if Urbit itself, as a platform, fails in its efforts to remake our Internet.
This series of posts has two purposes. First, it represents my attempt to think through and articulate the substance of Urbit’s critique of our present-day Internet, both to clarify my own thinking on the subject and to put it into a form that is accessible to non-Urbiters. Over the last eighteen months, I have absorbed this critique by reading and talking with people on the network, as well as through promotional materials provided by Tlon (including the Understanding Urbit Podcast) and other Urbit-adjacent creators. Second, I believe that Urbit’s critique, while valuable, is also incomplete: its focus on how technological choices have promoted Internet centralization ignores the political-economic factors that have shaped the Internet’s evolution. Without addressing those political-economic factors, Urbit won’t succeed in its mission.
Authoritarian and Democratic Technics
To understand Urbit’s critique of our current Internet, I find it helpful to begin with a distinction that the 20th-century historian and sociologist Lewis Mumford took up in his work. Mumford was interested in understanding the broad historical evolution of technology and its impact on society. He was interested not only in specific technologies but, more broadly, in what he called “technics”: the systems of organization, knowledge, values, and beliefs that underpin their use. As the epigraph to this post indicates, Mumford identified a struggle that has played out throughout human history between two broad classes of technics: one that is “democratic” in nature and one that is “authoritarian”,¹ which roughly map onto the more contemporary terms “decentralized” and “centralized”.
For Mumford, democratic technics are small-scale and artisanal. They were our original technics of economic production and tool use; their emergence coincides with the earliest use of tools by Homo sapiens. Where technology is used, it is always “under the active direction of the craftsman or farmer”. Democratic technics have more limited capabilities than authoritarian ones, but they are more adaptable and resilient. We can think of democratic technics as being “antifragile” in the Talebian sense and as providing us with what Ivan Illich called “convivial tools”: that is, tools which “maximize individuals’ ability to envision and realize futures of their own choosing”.² Indeed, Mumford comes close to Illich’s notion of conviviality when he writes that:
The best life possible — and here I am consciously treading on contested ground — is one that calls for an ever greater degree of self-direction, self-expression, and self-realization.
In modern crypto parlance, we might say that ‘democratic’ technologies are permissionless and open, rather than permissioned and closed. Drawing from further afield, we can think of democratic technics as facilitating what the anthropologist Claude Lévi-Strauss called “bricolage”: self-directed, ‘do-it-yourself’ experimentation with materials at hand, in contrast to rationally organized engineering.³ Moreover, democratic technics are embedded within, and facilitate, a democratic form of life. For Mumford, this consists of, among other things:
[C]ommunal self-government, free communication as between equals, unimpeded access to a common store of knowledge, protection against arbitrary external controls, and a sense of individual moral responsibility for behavior that affects the whole community.
By contrast, we can think of authoritarian technics as exhibiting — and even requiring — hierarchy, systems of expertise, and centralized control. As such, authoritarian technics emerged much later than democratic technics, initially with the rise of the earliest grain states around 4,000 B.C., as hunter-gatherer populations came to be settled at specific sites that later became ancient city-states. Early states required canals, forced labour, knowledge of astronomy (to predict harvesting times), and large-scale coercive power to support these systems through the imposition of taxation on subject populations. Building on Mumford’s insight, we can further say that authoritarian technics both depend upon and facilitate standardization and legibility. As the anthropologist James Scott has noted, early states tended to arise in areas that could support the cultivation of a grain crop, like wheat, which could be easily appropriated, measured, and stored, thereby simplifying taxation.⁴ Likewise, these states depended upon a subject population that could be easily administered and mobilized. In Herodotean terms, authoritarian technics are the Persian Empire while democratic technics are Athens and Sparta.
While powerful, authoritarian technics are fragile, in contrast to the anti-fragility of democratic technics. As Scott notes in his analysis of early grain states, they were exceptionally prone to collapse, both economically and ecologically. This was, in part, because settling in a specific place meant bringing plants, livestock, and humans into close proximity for the first time, which induced the development of new zoonotic pathogens, such as influenza. Ancient states were also fragile because, according to Scott, their subject populations often didn’t wish to be there. Settling in the ancient domus meant giving up one’s freedom, often in exchange for an increased risk of disease, poorer nutrition, and greater inequality. Early states thus didn’t face the problem of barbarians invading them so much as their subject populations wishing to abandon them.
Technologies organized under an authoritarian technic can deliver enhanced performance compared to those organized around a democratic technic. They can be rationally designed, planned, and implemented by skilled engineers and often exhibit constant or increasing returns to scale. As such, the use of authoritarian technics has closely tracked the growth of civilization. However, the adoption of authoritarian technics entails a concomitant loss of freedom. While they are no longer generally used to impose outright slavery and forced labour on subject populations (at least in the West), authoritarian technics necessarily induce a dependency relationship between a society and a relatively small group of scientific or technical experts, with a corresponding loss of freedom on the part of the former. There is no way, at present, to organize the production of nuclear power in a ‘democratic’ manner. Off-grid solar, by contrast, fits within a democratic technic, but can’t match the electricity output or reliability of the grid.
Even as states have generally become less coercive, authoritarian technics have enabled new forms of centralized power and control. As the sociologist Michael Mann has noted,⁵ we have generally seen a reduction over time in what he calls the ‘despotic power’ of states: that is, the ability of a monarch to arbitrarily execute his or her enemies. In its place, though, we have seen a substantial increase in states’ ‘infrastructural’ power: the ability to tax income or wealth at the source, store and recall massive amounts of information about their subject populations, or directly regulate economic activity itself. The authoritarian technics of the state penetrate everyday life to a degree that would have been unimaginable even a century ago.
Ironically, according to Mumford, the use of authoritarian technics intensified under the growth of mass democracy, particularly in the twentieth century, and came to threaten the spirit of democratic life in all but the most formalistic sense. As individuals became more politically equal, systems of power that had previously existed in the political sphere were reconstituted technologically. Thus:
[A]t the very moment Western nations threw off the ancient regime of absolute government, operating under a once-divine king, they were restoring this same system in a far more effective form in their technology, reintroducing coercions of a military character no less strict in the organization of a factory.
In place of kings, society was ruled by experts whose power was bound up in their growing capacity to automate larger and larger numbers of economic and social processes.
According to Mumford, the intensification of the authoritarian technic has come about in the form of a “magnificent bribe”:
[E]ach member of the community may claim every material advantage, every intellectual and emotional stimulus he may desire, in quantities hardly available hitherto even for a restricted minority: food, housing, swift transportation, instantaneous communication, medical care, entertainment, education. But on one condition: that one must not merely ask for nothing that the system does not provide, but likewise agree to take everything offered, duly processed and fabricated, homogenized and equalized, in the precise quantities that the system, rather than the person, requires.
Punch Cards, Mind Bicycles, and Virtualized Empires
Writing in 1964, Mumford saw computer technology as a key component of the emergent twentieth-century authoritarian technic. For him, “the inventors of nuclear bombs, space rockets, and computers are the pyramid builders of our age”. This view of computing was not out of sync with popular conceptions of that era, or with the history of computing up until that point. In that same year, members of UC Berkeley’s Free Speech Movement famously marched on campus wearing IBM punch cards around their necks to protest what they saw as dehumanizing treatment of students by Berkeley’s administrators. Mario Savio, a leader of the FSM, gave an interview where he explained that he and others felt that “At Cal, you are little more than an IBM card”.⁶
But something curious happened in the six decades following 1964. In the span of a single human lifetime, computers went from being an impersonal technology of bureaucratic rationalization — in other words, a component of Mumford’s authoritarian technic — to a new technology of personal liberation, and then finally back again to an impersonal technology of control in the present day. By 1980, a mere 16 years after Mumford and the FSM, Steve Jobs was famously describing the personal computer as a “bicycle for the mind”, a tool for amplifying human cognitive abilities. In interviews he gave at that time, he explained that humans are fundamentally tool-makers, imagery similar to Mumford’s own description of democratic technics. And the bicycle is, perhaps, the quintessential ‘democratic’ technology in Mumford’s sense of the term: one can own it in a deep, meaningful sense, and use it to facilitate a ‘greater degree of self-direction, self-expression, and self-realization’. Jobs’ imagery reflects a broader cultural shift that occurred in the Bay Area in the years after 1964, which centred around Stewart Brand and the Whole Earth Catalog. These developments led to a hybridization of hippie and libertarian culture that came to be known as the Californian Ideology, in which information technology came to be seen as enabling “a new Jeffersonian democracy where individuals could express themselves freely within cyberspace”.⁷ Underpinning this cultural shift was, of course, the rise of the Internet, first at a handful of government agencies and universities, and eventually among the broader public.
To some extent, Jobs’ imagery was always a fiction. While there was a brief era of “Mac Clones” in the 1990s, when Apple licensed Mac OS to generic hardware manufacturers, Apple has otherwise always maintained tight control of both its hardware and software. In that sense, the Macintosh was very different from the bicycle, which could be fixed and maintained by an individual. Even in the 1990s, using a Macintosh was a bit like using the power grid: you could treat it as neutral infrastructure and build on top of it in whatever manner you wished, but you weren’t permitted to tinker with it yourself. Yet by the early 2000s the free software movement was in full swing, and computers could be democratic technologies in a genuinely Mumfordian sense, particularly if one was the sort of person willing to pay the cost (in terms of time and frustration) of running open-source software on generic PC hardware.
Today, however, we see computers less as an enabler of self-directed human agency and more as a centrally organized force outside of any individual’s control, or increasingly, any human’s control. It is a daily battle to resist the systems of algorithmic control under which we live, which attempt to shape our behaviours and preferences in increasingly bizarre ways in order to monetize our attention. Moreover, it is practically impossible to ‘own’ the software or data that we use on a day-to-day basis, both of which exist on cloud-based servers that can only be accessed in heavily mediated ways.
Our online lives are shaped by a handful of large online platforms — the megacorps — which behave less like neutral infrastructure or tools and more like empires. To be more specific, we might think of them as ‘virtualized’ empires, in both the technical and colloquial sense of the term. On the one hand, they possess and control sovereign territories (e.g. platform-specific newsfeeds and usernames) that are real but exclusively virtual, in contrast to the physical territories of real-world nation states. On the other hand, they are “virtualized” in the sense that a cloud server is virtualized: they quite literally run on top of virtual servers operating within global datacenters. Much like the builders of the Great Wall of China, virtualized empires create walled gardens that limit composability, as much to keep monetizable subjects (e.g. taxpayers) in as to keep bad actors (e.g. “barbarians”) out.
The growth of these virtualized empires has been largely responsible for making the Internet accessible to the majority of humanity, but it has had serious downsides. Looking through Mumford’s description of the properties of democratic life, it is striking how many have been eroded, directly or indirectly, by the centralization of our digital lives onto a handful of online platforms. Instead of communal self-government, democratic elections have become the object of targeted disinformation campaigns. In a misguided attempt to prevent such campaigns, the APIs of once nominally open platforms like Facebook have been locked down, leaving control of the platform itself to a small technical elite. Platforms no longer function as a common store of knowledge; they now censor information or ideas that go against expert consensus. Platforms incentivize sociopathic mob behavior on the part of their users, in contrast to the individual moral responsibility required of individuals in the ideal of democratic life. The users of online platforms are more like the subject populations of ancient grain states, whose labour and attention are monetized to enrich the king.
In many respects, the history of the Internet thus represents a speed run of the broader history of humanity. There was a long stretch of time, following the creation of ARPANET and later NSFNET, when what we now call the Internet was sparsely populated, much like the early history of humanity, when the world was inhabited by largely egalitarian hunter-gatherer bands. Like the rise of ancient states, the rise of virtualized empires brought about considerable population growth (i.e. user growth), but at the expense of the equality, freedom, health, and happiness of individuals.
In the next post, I will explore several explanations for the growing centralization of computing since the 2010s.
~tonned-fampet
References
¹ Mumford, Lewis. ‘Authoritarian and Democratic Technics’. Technology and Culture 5, no. 1 (1964): 1–8. [https://doi.org/10.2307/3101118](https://doi.org/10.2307/3101118).
² Illich, Ivan. Tools for Conviviality. London: Marion Boyars, 2001.
³ Lévi-Strauss, Claude. The Savage Mind. Second Impression edition. London: Weidenfeld and Nicolson, 1968.
⁴ Scott, James C. Against the Grain: A Deep History of the Earliest States. Reprint edition. New Haven: Yale University Press, 2018.
⁵ Mann, Michael. ‘The Autonomous Power of the State: Its Origins, Mechanisms and Results’. European Journal of Sociology / Archives Européennes de Sociologie / Europäisches Archiv Für Soziologie 25, no. 2 (1984): 185–213.
⁶ Turner, Fred. From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. Illustrated edition. Chicago: University of Chicago Press, 2008.
⁷ Barbrook, Richard, and Andy Cameron. ‘The Californian Ideology’. Science as Culture 6, no. 1 (1996): 44–72. [https://doi.org/10.1080/09505439609526455](https://doi.org/10.1080/09505439609526455).