Michael Rossiter
3 min read · Jan 10, 2016

Every Christmas, the Boston branch of my family, about 15 of us, does a Secret Santa. This year, I had the mixed fortune of getting my Uncle John. John is a very successful businessman and loves toys. Motorcycles, golf clubs, if he wants it, he’s got it.

So what do you get for the guy who’s got it all?

I went with a Google Cardboard headset.

Ok, so it was sort of one of those situations where you buy someone a present that you yourself want (my wife glumly received a Chromecast as part of her birthday present two years ago).

Once we got it working, the reaction said it all.

It was awesome. I hadn’t paid much attention to VR and AR before, but now that I’m aware of them and have glimpsed what they might offer, I’m seeing them everywhere in articles and use cases.

This one (VentureBeat) in particular caught my eye.

“Nvidia said that VR takes seven times more graphics processing power than a typical PC game. Luckey said it was demanding because the graphics card has to render a 3D image for each eye and run it at 90 frames per second. Any slower, and the experience could make people motion sick.

“‘All of a sudden, a huge number of people are going to want a high end PC,’ Luckey said.”
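
The “seven times” figure roughly checks out if you do the math. Here’s my own back-of-the-envelope sketch, assuming Rift-class specs (two 1080×1200 panels at 90 Hz, rendered at roughly 1.4x resolution to compensate for lens distortion) against a 1080p game at 30 fps; those particular numbers are my assumptions, not from the article:

```python
# Back-of-the-envelope pixel throughput comparison (illustrative numbers only).

# A typical 1080p PC game at 30 fps
typical_pixels_per_sec = 1920 * 1080 * 30  # ~62 million pixels/sec

# A Rift-class VR headset: two 1080x1200 eyes at 90 fps, each eye rendered
# ~1.4x wider and taller than the display to compensate for lens distortion
oversample = 1.4
vr_pixels_per_sec = 2 * (1080 * oversample) * (1200 * oversample) * 90  # ~457 million pixels/sec

print(f"Typical game: {typical_pixels_per_sec / 1e6:.0f} Mpixels/sec")
print(f"VR headset:   {vr_pixels_per_sec / 1e6:.0f} Mpixels/sec")
print(f"Ratio:        {vr_pixels_per_sec / typical_pixels_per_sec:.1f}x")

# Frame time budget: at 90 fps you get ~11 ms per frame, for both eyes,
# every single frame. Miss it and people get motion sick.
print(f"Frame budget: {1000 / 90:.1f} ms")
```

Roughly 7x the pixel throughput, delivered in about 11 milliseconds per frame, every frame.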

At a given time, the configuration of computing is determined by the relative benefits and costs of processing, storage, and communications. The key question is where to put the computing resources: locally on the user’s machine or on a remote server? Historically, Moore’s Law and the needs of the moment have applied to processing and storage fairly consistently, so the two have sat on the same side of the client-server divide.

So, for example, when communications efficiency is high relative to hardware (storage & processing), you get a ‘thin-client’ architecture like we had in the mainframe era and have again today. In this model, the user’s machine provides input/output and some local resources, but the majority of true computing happens remotely.

In contrast, when communications efficiency is low relative to computing, you get a ‘thick-client’ architecture, where the local machine performs both input/output and much of the actual processing and storage.
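
To make the tradeoff concrete, here’s a toy sketch of the placement decision. None of this is a real system; the function name and the numbers are purely illustrative:

```python
# Toy model of the client-server tradeoff: run a task locally or remotely?
# All names and numbers here are illustrative, not from any real system.

def choose_placement(work_units, local_speed, remote_speed,
                     payload_mb, bandwidth_mbps, rtt_ms):
    """Pick 'local' or 'remote' based on estimated wall-clock time."""
    local_time_ms = work_units / local_speed * 1000
    transfer_ms = payload_mb * 8 / bandwidth_mbps * 1000
    remote_time_ms = rtt_ms + transfer_ms + work_units / remote_speed * 1000
    return "local" if local_time_ms <= remote_time_ms else "remote"

# Fast, cheap network relative to local hardware -> thin client wins
print(choose_placement(work_units=100, local_speed=10, remote_speed=1000,
                       payload_mb=1, bandwidth_mbps=100, rtt_ms=20))    # remote

# Slow network relative to local hardware -> thick client wins
print(choose_placement(work_units=100, local_speed=10, remote_speed=1000,
                       payload_mb=50, bandwidth_mbps=1, rtt_ms=200))    # local
```

Cheap, fast communication pushes the work out to the server; slow or expensive communication pulls it back onto the local machine.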

For the last 15+ years, one of the dominant trends in computing has been the move to the cloud. We have been moving from ‘thick-client’ (desktop PCs, Web 1.0, and on-premise enterprise applications) to ‘thin-client’ (tablets, Web 2.0, and SaaS).

It will be fascinating to see how the emergence of VR and AR impacts that trend. If VR/AR is widely adopted, it should be a boon for computer hardware manufacturers. Smartphones and tablets are ‘thin-client’ devices where the key differentiators have been the software experience, screen size, and battery life. VR/AR adoption would drive a cycle where the differentiators are processor and graphics performance. PC manufacturers haven’t seen a cycle like that since the ‘90s.

One wrinkle might be a ‘decoupling’ of processing and storage resources, with a requirement for more significant local processing of real-time data while relying on remote storage of the key variables used to reconstruct a virtual world. This model has already manifested in gaming, for example, where multi-player games send packets of information to players’ machines to keep everyone’s experience in sync. I remember my glory days of X-Wing vs. Tie Fighter on the PC (1999–2000), when a ‘laggy connection’ (my 28K modem) was a huge disadvantage, even though single player ran beautifully.
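
In that split, the heavy per-frame work stays on the local machine while the server only ships small, authoritative state updates that everyone agrees on. A minimal sketch of the idea (all the names below are hypothetical, not how any particular game actually does it):

```python
import time

# Minimal sketch of 'thick processing, thin state': render locally every frame,
# but reconcile against small authoritative state packets from a remote server.
# Class and function names are hypothetical, for illustration only.

class GameClient:
    def __init__(self):
        self.world_state = {"players": {}, "tick": 0}   # small shared state

    def receive_server_packet(self, packet):
        # A few hundred bytes of authoritative state, not rendered frames
        if packet["tick"] > self.world_state["tick"]:
            self.world_state.update(packet)

    def render_frame(self):
        # All the expensive per-frame work (two eyes, 90 fps) happens here, locally
        pass

    def run(self, get_packet, frame_rate=90):
        frame_budget = 1.0 / frame_rate
        while True:
            start = time.time()
            packet = get_packet()             # non-blocking poll of the network
            if packet:
                self.receive_server_packet(packet)
            self.render_frame()               # local GPU does the heavy lifting
            time.sleep(max(0.0, frame_budget - (time.time() - start)))
```

The local machine never waits on the network to draw the next frame; a laggy connection only delays the small state updates, which is exactly why my 28K modem hurt in multi-player but not in single player.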

I guess what I’m really saying is that what I want for Christmas in 2016 is VR X-Wing vs. Tie Fighter.
