History of Wearable and Ubiquitous Computing

Abhinuv Nitin Pitale
Jan 19, 2018

--

I am taking a course on wearable devices with Prof. Tom Martin at Virginia Tech. He started the course with a great introduction to the history of ubiquitous computing. I thought of writing it up as a blog post, as I was amazed by the astonishing accuracy of the predictions made by the stalwarts of this field.

The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.

This was stated in a 1991 article in Scientific American by the father of ubiquitous computing, Mark Weiser, who was then working at Xerox PARC (a company known for not being able to monetize its amazing inventions). The article was followed by an ACM journal publication in 1993.

He starts off with the idea of truly ‘ubiquitous’ computers in a range of sizes, from ‘Post-it’-sized tabs to full-size boards. Weiser’s idea of a ubiquitous ecosystem involved tabs, pads and boards that were context-aware and scalable. His inspiration for these kinds of interfaces came from observing the average white-collar programmer/computer/terminal-operator job prevalent at the time.

Before I describe the devices he built, let’s take a look at a ‘state of the art’ processor from the early ’90s:

EV4 (DEC Alpha): 200 MHz, 8K of RAM, 30 W power dissipation

Despite the obvious technological challenges, Weiser and his team at PARC built the tabs, the pads and the boards.

The tabs were like an identity badge with a display that could double up as a calendar and a diary. They had a touchscreen, three buttons (does this not remind you of something you’re currently carrying in your pocket??) and ~2 weeks of battery life. The pads were portable, palmtop-like devices. The best of them all were the boards, which could be interactively written on using electronic chalk. (This was really impressive considering the computing power and resources they had. Imagine rendering an HD display and using it as a ‘touchscreen’ with just 8K of RAM and ~200 MHz of clock speed.) They even had a shared transatlantic link between these boards. (Once again, this was before the Web had arrived.)

Even more surprising are the predictions he makes about the future of computing. He accurately predicts the rapid decline in display prices, Moore’s law driving up computing power, the availability of cheap storage, cloud computing (he calls it networks that make remote storage appear like local disks), micro-kernel operating systems for small devices (think Android) and the emergence of cheap cellular networks.

He ends the Scientific American article with a fairly accurate example of what the future might look like. It features a working professional who uses a digital assistant that interacts with her through speech recognition, makes her a cup of coffee and reads her the news. (Hey Alexa, looks like we found your mom.) She then uses live traffic routing to find the best route (Google Maps, anyone?), which is followed by a description of a smart suite of office tools for meeting management (i.e. Outlook, Skype, etc.) and multi-tasking, multi-windowed, multi-user application software for work (TeamViewer, screen sharing, Windows, and the list goes on…)

Weiser also cautions us about the possible impacts of this immersive reality on security and privacy.

This article and the corresponding work at PARC were truly way ahead of their time. I often wonder what would be going on in Weiser’s mind right now. I definitely would have wanted to ask him about the possibility of ‘Beam me up, Scotty!’
