Week 2 ・ Reflection
I’m interested in the position of the designer of a future computing system like the one Weiser describes, and in the relationship between tool and user when the tools drop out of direct perception into a less perceptually mediated experience.
I think Weiser is too quick to dismiss VR. Though in his day it was certainly isolating, and if left unchanged not an ideal future, it nevertheless offers a powerful way to prototype the sorts of systems he only describes in words, and it can give a test user a reasonable approximation of a world that is impossible to build with current material technology.
The difficulty of such a system is that it seems too vast and daunting for even the largest team of designers to fully preplan. I can imagine that such a deeply embedded system could learn directly from its users, personalizing itself based on the multitude of data sources gleaned from such omnipresent sensing. I wonder how the user would perceive the system. Would they see it as a designed “thing” in the way every artifact up until now has been artificially selected? Or would they accurately recognize the natural selection that shaped the system into its current close fit with their lives?
Weiser’s concept reminds me of Bret Victor’s Dynamicland, a collaborative computing space developed in part with the participation of Xerox PARC’s own Alan Kay. It is a distributed computing system with an arbitrary number of “screens” in the form of projection-mapped surfaces, where programs are physically instantiated on pieces of paper. It supports a very humane interpersonal dynamic: all the objects relevant to a computation sit on tables, available for collaboration and play.
Weiser mentioned that the prototypes were present in a common, informal space. I see this as opening them up to testing novel possibilities that might not have been arrived at through more staid user testing, instead allowing organic experimentation through omnipresence.
