Rise of the Immutable Operating System

Mark Nadal
Nov 10, 2014

--

Has software really improved in the last 5 years? My thesis is no, and I’ll show you why and how we can escape it.

Early 2002

This was the forum I frequented, captured by the Wayback Machine in 2002. It was powered by phpBB, which honestly was quite good. Yes, we’ve come a long way since then, like no longer needing to type BBCode to get our emoticons to work. But what else?

April 2004

Ah, the famed gmail, where AJAX (as it would come to be named) would start changing the game. Yes, this was a decade ago, and quickly thereafter we started seeing realtime chat rooms and even Google Docs (then Writely) show up before 2006.

Hardware was horrible, but software worked.

Give or take, this is roughly the hardware I used back then. It sucked, and that is why I’ll hail software from 2004 to 2008 as the greatest ever experienced. Because it worked: I could actually instant message somebody, over dial-up (yes, my internet sucked too), in the browser, without the web page eating up a gig of RAM.

Approximately 1/26th the size of all of Wikipedia.

Because, yes, no joke, gmail can nowadays eat up that much memory. This isn’t even photos like on Facebook; this is just plain old text-based email. I could run Crysis at the time on hardware with less than 2GB of DDR3 RAM, simulating an entire tropical island in mesmerizing detail. Maybe we should blame Chrome for the choke?

So what has changed since then?

We got sticky headers. Plus a lot of bloated javascript frameworks that take forever to load, and websites that break without them. What happened to the promise of Web 2.0 that gmail circa 2004 prophesied?

The realtime web of then managed to work in IE6, but today web apps often break unless you’re running the latest Chrome. However, what went wrong is a systemic issue that originates from far earlier than half a decade ago.

It goes deeper than you think.

In 2009, a lone mathematician of insurmountable genius released something that would change the way nearly all developers think about servers. Then, in a sudden turn of events, he vanished in late 2011, leaving behind only a cryptic rant about the status quo in software.

In the four years since, developers have slowly been waking up, discovering that all the layers of convenience built into programming languages and databases over the last 30 years are actually getting in the way. They hide the very primitives that your app needs to build upon.

Frameworks can never be Turing Complete.

Behind the mask lies inextricable complexity that now makes our 2014 machines slow. Every subsequent generation of programmers is told, “Don’t solve X, it is hard; we made you Framework Y instead.” We aren’t running a 2014 machine, we’re running a 2014 machine that emulates a 2011 computer, which simulates a 2008 tool set, that relies on 2004 technology, which was built for 2001, which was just an iteration over 1997.

Do not misquote me: I am not saying (as is popular nowadays) to “Build your App to work on the Browser of Tomorrow”. In fact, that is the exact opposite of what I mean. The browser of tomorrow is that utterly obfuscated system that relies on $LD_LIBRARY_PATH and all the nasty mutable config files of your operating system.

Dawn of a Reborn Era

We have hit the peak of where procedural components can take us; we have suffered the race conditions and now admit their insufficiency. Instead, we see the light of single-threaded processes, which react functionally to streams of immutable events. NodeJS restarted this trend, and its ideology is spreading.

That lesson in exposing the raw pipes, whether it be a socket from the browser or a file descriptor, is transforming the definition of necessary. From that, golang was born, heralding that all you need is some low-level primitives and communication. That the way to play with parallelism and concurrency is not to hide it, but to expose it. Those virtuous enough to use such constructs shall be empowered with a distributed system.

Graph database with push notifications.

Powered by these concepts, we set out to apply them to databases, thus creating gun, a realtime distributed graph database. Imagine a decentralized, open source version of Firebase that runs on phones, browsers, and your servers. All interconnected, chainable, and reactive to each other. Just like the web should be: idempotent streams of data to any client that can GET.

Forget the painful databases based on procedural, mutable, centralized ideas. They come with the same flaws as the languages that are now being abandoned: rife with complex logic that tries to coordinate globally consistent state via some delicate sharding and consensus algorithm, yet always failing, because no amount of complexity can mask network partitions. Immutability is on the rise.
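The idempotent streams mentioned above can be sketched as a last-write-wins merge over timestamped updates. This is an illustrative toy (the names and shapes here are made up, not gun’s actual API): any peer can receive the same updates in any order, any number of times, and still converge on the same state.

```javascript
// Illustrative sketch of an idempotent update stream: each update
// carries a timestamp, and merge keeps only the newest value per key.
// Replaying or duplicating an update is a no-op, so peers converge
// without any central coordinator.
const state = {}; // key -> { value, at }

function merge(update) {
  const current = state[update.key];
  if (!current || update.at > current.at) {
    state[update.key] = { value: update.value, at: update.at };
  }
}

const u1 = { key: 'name', value: 'mark', at: 1 };
const u2 = { key: 'name', value: 'Mark Nadal', at: 2 };

merge(u1);
merge(u2);
merge(u1); // replayed out of order: idempotent, no effect
merge(u2); // duplicate delivery: also no effect

console.log(state.name.value); // "Mark Nadal"
```

This is why no consensus algorithm is needed for this class of data: a partition just means some updates arrive later, and the merge is indifferent to when or how often they do.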

Even the gods of Hollywood are paying attention.

But it is not stopping there. People are now applying these concepts to the Operating System itself, loosely in the forms of Docker and CoreOS. And articles like this have gotten upvoted to the top of Hacker News. Even more recently, one project is so crazy it aims to put a bullet into the systemic issues for good, by rewriting everything since the 1970s from scratch. Maybe we really are at the dawn of a better era of computing.

A Vision for the Future

Finally, I would like to summarize my thoughts by musing about an ideal system, building on the touches of design and user experience that Ryan’s rant illuminated.

First, javascript that runs at the hardware level, despite this being satire. Not because javascript is a good language (that is hardly arguable), but because it is the lingua franca of the web. What does make javascript good, though, is that it is event driven as a UI language. Plus, stateless communication with other machines is its bread and butter.

Coupled together, these two things push a very healthy dose of distributed logic. With that comes some incredible opportunities, like near endless storage and practically infinite processing. Do not confuse this with the cloud, where the NSA does all the work for you and your device is the dumb client. Everything instead is localized, but as boundaries are hit they slowly overflow and offload onto your other nearby devices or the clouds you trust.

Take, for example, your phone: if that movie you are watching can’t fit in RAM, it overflows to disk. If it can’t fit on disk, it overflows to your laptop nearby; if you have too many videos on that hard drive, it overflows to the cloud. Same with CPU: doing some intensive photo editing on your phone? Some of it gets distributed to your laptop so you can still answer texts from your phone. Doing a massive computation? Bleed into the cloud.
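The overflow idea above amounts to a simple placement rule, sketched here as a toy. The tier names and capacities are made up for illustration: data spills to the nearest tier with room, from RAM outward to the cloud.

```javascript
// Toy sketch of overflow placement: each storage tier has remaining
// capacity (in GB), ordered from nearest (fastest) to farthest.
// Data lands in the first tier that can hold it.
const tiers = [
  { name: 'phone RAM',  free: 1 },
  { name: 'phone disk', free: 8 },
  { name: 'laptop',     free: 256 },
  { name: 'cloud',      free: Infinity },
];

function place(sizeGB) {
  for (const tier of tiers) {
    if (tier.free >= sizeGB) {
      tier.free -= sizeGB; // claim the space
      return tier.name;
    }
  }
}

console.log(place(0.5)); // "phone RAM"
console.log(place(4));   // "phone disk"  (RAM is now too full)
console.log(place(100)); // "laptop"
```

A real system would weigh latency, battery, and trust, not just free space, but the shape is the same: the immediate context stays closest, and everything else bleeds outward.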

Every device focuses on its immediate context, to give the user the fastest experience possible. Like what gmail did in 2004, even on horrible hardware. Automatically pushing old contexts further away, and shifting in more important contexts before the user can even finish tapping. But gosh darnit, without choking my entire browser like Chrome does.

Second, abolishing apps as tools. Apps should not be walled-garden, packaged silos. Instead they should be feature sets that functionally react to streams of data, usable in any experience. Today, users have to fragment their minds like a computer: when they want to do social networking, they look for the “social networking” app; when doing photo processing, the “photo processing” app; and so on. This is detrimental.

Imagine instead an operating system that adapts to the behavior of the user. When I start typing, anywhere, it should evolve into a word processor. If I mention my friend’s name in a titular form, it becomes an email or text. When I start talking to my phone, it dials that contact (because I want to speak with humans, not Siri), recording my comments as a voice message unless they answer. When I take a picture, it transforms into a photo editor, and so on. Give the user what they want, as well as alternatives.

This is not about the interface shape shifting, because that would be confusing for the user and difficult to achieve without icons from apps. This is about removing the visual interface entirely and embracing the inputs on devices, be it touch, mouse, keyboard, audio, camera, LEAP, MYO, or other. These are physically the only way we can interact with computers, so why don’t we use them more?

A keyboard can do more things than type letters; on a piano, keys produce notes. As words are to qwerty clacking, music is to the composition of notes. Cameras are also light sensors, so why don’t they adjust brightness? At this rate, it is mind-blowing that phones even self-adjust their orientation based on the gyroscope. The interface is in the interaction, not in the menu or button.

In conclusion, these are my observations as of late, my peering conjectures of the coming future, and the superlative end of my research. As in life, as in data, my work on Accelsor shall eventually be consistent with those ideals. And in time, the only resource I have is to immutably choose to emit events toward synchronizing with that state, by solving the problems that exist now. Till then, you and I are the only immutable operating systems. The paradigm in which our minds operate, the systems in which we partake and live, are being recorded in an immutable append-only log called history.

p.s. so emit beautiful choices, like meeting me at JSFest Oakland, December 12th — I’m doing a talk! Let’s make API calls to each others’ minds and ❤s.


Mark Nadal

autodidactic philosopher, entrepreneurial visionary, wanderlust logician.