Home Automaton Will Automate Us Right Out Of Home
Virtualizing real space, one Nest at a time
I’ve been at home more in the past month than at any other time in my adult life. In the process of trying to make an accidental sabbatical a purposeful one, I’ve gone from spending 3 waking hours a day in my apartment to something more like 12 (gym, coffees with startups doing interesting things, or a sneaky midday movie are necessary breaks).
This inordinate amount of home time has made me appreciate how absurdly under-utilized residential space is. For most working adults, and even for families with all but the youngest kids at home, the house is essentially useless (“idle” in computing terms) for much of the day.
These idle spaces are one of the largest stores of value in our economy. In the US, all homes combined are worth $26 trillion, surpassing the market cap of all public companies combined. Homes, and the land they’re on, are highly adaptable, valuable spaces that just sit there like sentimental lumps during the day.
(Of course, we have extraordinary attachment to the places we call home. The mere fact that the house isn’t occupied by strangers when we aren’t home accrues psychological value. In computing, a CPU is a CPU: I don’t feel dislocated when my work is executed in a remote datacenter the way I do staying in a generic hotel in an unfamiliar place. Technology has a strange way of recreating what it displaces in new forms, though, and I suspect our collective attachment to “home” will fade as more spaces, online and off-, can evoke the same emotions.)
Collectively and individually, it stands to reason that we would be motivated to get a whole lot more value out of this mostly idle asset.
Early computer engineers were faced with a similar problem.
Computing up until the early 60’s consisted of physically massive mainframes. IBM dominated the business, and their purposefully modular, lowest-starting-price machines ran a cool $250,000 ($1.8M in today’s dollars). These pricey, room-sized computers could only run one program at a time. Bob ran one program to do something like tally up regional sales at 2pm. Then at 3pm, Mary could run her program to spit out production targets.
The fundamental constraint behind this inefficiency was simple: the computer only had a unified, expensive-to-maintain set of core resources. Programs used various commands to act directly on things like the CPU (asking for some math to be done), memory (storing intermediate results), and files (for saving the results of all that work). These resources were treated as a unified whole, to which a program had sole access as it ran.
In other words: early computers were a fantastically expensive, highly useful single physical resource that sat idle much of the time. Sounds familiar.
Along came the System/360 and its CP-67/CMS operating system, a watershed moment. Rather than implement some complex system to request, acquire, and release resources so that Mary and Bob could use the computer at the same time — a choice that would have required rewriting every program then in existence — the engineers chose a kind of sleight of hand: All programs would operate on a “virtual” version of the computer, able to just go about their ordinary business of changing files or tallying up numbers. A new layer of software*, what we would loosely call today the hypervisor, translated each program’s virtual requests into real ones, acting on the underlying resources efficiently, while hiding from Bob and Mary all the messy scheduling, locking, and coordinating the real computer had to do.
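The sleight of hand is easier to see in miniature. Here’s a toy Python sketch (ahistorical, and nothing like real CP-67 internals) of the core idea: each program addresses its own “virtual” memory starting at zero, and a translation layer maps those addresses onto disjoint slices of the one real machine.

```python
# Toy illustration of virtualization: Bob and Mary each believe they
# have sole access to memory, but a translation layer maps their
# virtual addresses onto different regions of one physical memory.

REAL_MEMORY = [None] * 16  # the single, expensive physical resource


class VirtualMachine:
    """Gives one program the illusion of owning the whole machine."""

    def __init__(self, base):
        self.base = base  # where this VM's slice begins in real memory

    def write(self, virtual_addr, value):
        # Translate the virtual address, then act on the real resource.
        REAL_MEMORY[self.base + virtual_addr] = value

    def read(self, virtual_addr):
        return REAL_MEMORY[self.base + virtual_addr]


bob = VirtualMachine(base=0)
mary = VirtualMachine(base=8)

bob.write(0, "regional sales")        # Bob's address 0...
mary.write(0, "production targets")   # ...is not Mary's address 0

assert bob.read(0) == "regional sales"
assert mary.read(0) == "production targets"
```

Neither “program” was rewritten to share; each still thinks it acts directly on the machine, which is the whole trick.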
Virtualization, as this bundle of technologies is known, is crucial to enterprise computing and one of the key ingredients of rocket fuel for the current crop of startups. Abstracting away the high-cost real computer allowed, many iterations later, for services like Amazon Web Services to expose a cheap, easy-to-build-on platform on which most software you interact with today depends.
The leap was conceptual, not physical. Huge advances in the very physical aspects of computing like transistor density have of course contributed. Still, virtualization of compute resources has, simply by introducing new layers of abstraction over expensive physical resources, revolutionized our relationship to computation.
Virtualizing physical space could have an effect as radical on cities as virtualizing compute resources has.
What would that look like?
“Alexa, turn the kitchen lights blue.”
And they were blue.
My boyfriend looks only slightly amused. “So this is your new project?”
I am an absolute glutton for opportunities to explore new domains of gadgetry. I held off buying even the first Hue bulb until I didn’t have a full-time job, knowing it would become a consuming hobby.
As the Amazon shipping detritus piles up and the little white boxes plugged into every available outlet proliferate (Hue, Wink, Eve, and on and on), I may feel that I’m making the house more mine, more personal, more fun. Installing an open-closed sensor on the outside door provides a sense of security and effortless control of lighting on the deck. But every device also creates an entry in a computer-readable, software-modifiable model of my physical apartment.
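What might that computer-readable model look like? A hedged sketch, with device names and schema entirely invented for illustration: once every bulb and sensor reports in, the apartment is effectively a data structure, and “effortless control of lighting on the deck” is just a rule running against it.

```python
# Invented sketch of a home as a software-modifiable model.
# Device names, fields, and the rule below are illustrative only.

apartment = {
    "kitchen/lights": {"type": "hue_bulb", "on": True, "color": "blue"},
    "deck/door":      {"type": "open_closed_sensor", "open": False},
    "deck/lights":    {"type": "hue_bulb", "on": False, "color": "white"},
}


def run_rule(model):
    """One tiny 'program' on the virtual apartment:
    light the deck whenever the outside door opens."""
    if model["deck/door"]["open"]:
        model["deck/lights"]["on"] = True
    return model


apartment["deck/door"]["open"] = True   # someone steps outside
run_rule(apartment)

assert apartment["deck/lights"]["on"]   # the deck lights come on
```

Each new gadget adds another key to that dictionary, and another surface that software, mine or anyone’s, can read and act on.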
With each smart sensor we add to our homes and offices, the physical world becomes more virtual. By seeing these early sensors as toys or gadgets, we are missing the opening up of a whole new layer of abstraction on top of physical space.
The ways humans use space are a kind of program: Cooking and serving a meal, for example, is a set of actions carried out on physical objects — some ephemeral, some fixed to the location. In the residential status quo, we insist on only running those programs on a computer we wholly own, even if it sits idle most of the time.
If most homes can be locked or unlocked programmatically, secured and visually monitored automatically, and adapted to each individual or group inhabiting them simply by their presence, these spaces are no longer really a “house” in our current sense of the word. The spaces we call home are becoming a new kind of fungible resource altogether.
Airbnb is already chipping away at the “real” in real estate, executing vacation programs on shared infrastructure. Breather steps up the granularity, offering meeting rooms in underutilized parts of commercial buildings an hour at a time.
Imagine the potential of programmable, abstracted, virtualized space:
- A suburban driveway becomes a CSA distribution point
- A living room becomes a therapist’s office at a fraction of the cost
- The family with an infant is able to step into a private room for five minutes to change a diaper, no matter where they are in the city
- A pop-up lunch spot with eight seats lets a chef prove out her concept, bringing real data to skeptical investors
- The YouTuber who needs to add variety to their shoots can find everything from raw loft space to backyard pools available for just a few hours
- Retail stores spring into existence for a few days in unused storefronts, restaurants closed during the day, or your empty garage, like Bulletin at another scale
I’m certain virtualizing “private” space will not happen all at once. We are heavily invested in the concept of single-user space, partially because it was such a critical component of the post-war consumer mythos of America that took the world by storm.
I am equally certain, though, that we are witnessing nothing less than the virtualization of real space. And I’m happily installing the destruction of my own concept of home, one smart bulb at a time.
*I am grossly oversimplifying here, engineer-friends. If this makes you mad, take that energy and spend it reading a fun, technical history of VM/370.