It’s undeniable that we are entering a new era of computing. Whether you call it the Internet of Things, the Third Wave, the Fourth Wave, or distributed computing, the technology landscape and the machines around us are starting to look different.
Plenty has been written about this shift, but one aspect that is often overlooked is how interfaces evolve as we move into new technical eras. An interface, for the purposes of this post, refers to how humans interact with machines. You use different interfaces every day, whether you realize it or not. As the hardware and software around us change to embrace this new generation, new interfaces will be born.
Early computers did not have monitors to display their output. Instead, users would enter commands into a terminal known as a teletype, and the results would get printed out on paper. Yes, real paper. Eventually, as computers became more mainstream, new interfaces like blinking lights, the monitor, and the mouse were born to let humans interact with machines more easily.
The 1990s brought a new era to the personal computer space, one where actions taken on a machine often depended on the state of a different machine across the network. This era introduced concepts like the World Wide Web, and it undoubtedly changed the world of computing as we know it. It also meant that our interfaces to machines had to change. We needed a way to visualize and easily digest the state of a different machine, and, as a result, the web browser was born.
Eventually, we entered the next generation of computing: mobile. This led to a plethora of new interfaces, from touch screens to responsive design to fingerprint sensors, that allow us to interact with the tiny machines we carry in our pockets.
Which brings us to today. We are in a new time. One where devices are more powerful than ever and contain their own application logic. One where there are so many connected things talking to each other that we, as humans, aren’t able to keep track…