Rethinking the information revolution
Written with Alex Flint
Beyond all the needs that it fulfils, all technological innovation is underpinned by a common driving force: the need to make information flow more efficiently. Ever since the first modern humans walked the earth, we’ve assumed that it was their survival instinct that drove innovation. It certainly was, but we forget that without the ability to pass information efficiently from one generation to the next, our ancestors would’ve had to reinvent the most basic things every time they were needed.
From the beginning of human civilisation to today, our aim has been to increase what can be termed brain-to-brain bandwidth. The idea encompasses not just the flow of information from one person to another but also how effectively it is transmitted, that is, how well it is understood or used by the person receiving it.
We’ve come to associate the last 50 years with the information revolution. But that is only because the industrial revolution that preceded it made life easy enough for us to focus primarily on information and its transmission. Is the information revolution slowing down, though? Certainly not.
The machine of the dreamers
The personal computer was expected to make its way into every home well before the 1990s, but its limitations in speed and memory did not let that happen. For many years its main users were technology geeks, nerds and hackers.
While no one doubted the achievement of the Apple I from a purely technical standpoint, giants of the field like IBM did not believe in the dream of the PC enthusiasts. In 1976 it was hard to imagine how an abstruse gadget in a wooden casing, with the title “Apple I” scrawled across the headpiece, would have a large impact on ordinary life. But should it have been so difficult? The fundamental role of information in our lives seemed to have been underplayed.
By the time the personal computer, as we know it*, was first built, it had already been over a decade since Gordon Moore’s prediction that the number of components on an integrated circuit would double every two years^.
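Moore’s prediction amounts to a simple compounding rule: component counts double every fixed period. As a minimal sketch of the arithmetic (the starting figure of 2,300 components, roughly the Intel 4004’s transistor count, is an illustrative assumption, not a figure from this article):

```python
def components(years_elapsed, initial=2300, doubling_period=2):
    """Project component count under Moore's doubling rule.

    initial: assumed starting count (illustrative only).
    doubling_period: years per doubling (2 per Moore; see the
    footnote on David House's 18-month variant).
    """
    return initial * 2 ** (years_elapsed / doubling_period)

# A decade at a two-year doubling period gives five doublings,
# i.e. a 32-fold increase in components.
growth = components(10) / components(0)
print(growth)  # 32.0
```

Five doublings in ten years is why a general-purpose information processing device could be expected to become affordable so quickly.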
A general purpose information processing device was going to be in demand and would become cheap enough for many to afford. But it still took a genius and a rebel like Steve Jobs to force the incumbents to accept that the PC age had begun.
The byproduct of science
The next innovation after the PC that had a comparable impact on humanity’s brain-to-brain bandwidth was the internet. The PC made possible a better way to access and manipulate information; the advent of the internet took things a step further by letting us connect that information with relative ease.
However, as with the PC before it, mass adoption took time. Invented as a means of transferring data between physicists, Sir Tim Berners-Lee’s idea took off in the mid-1990s. Since then the internet has disrupted not just information-transfer mechanisms but many other markets. From the postal system to the education system, anything with information transfer at its heart has been changed by the internet.
The rise of social
While many might dispute that social media is the next big innovation, there is little doubt that adding a personal touch to information flow has made a huge difference. Defined as a website that lets you create a profile page, connect with friends and view your friends’ connections, the first social networking website was SixDegrees.com, launched in 1997.
Since then, of course, social networking sites like MySpace, Orkut, Facebook, Twitter and, most recently, Google+ have drawn hundreds of millions of users. Even though Facebook is not quite worth $100 billion just yet, the sheer number of its users has helped it create a parallel world of its own on the internet. A little less than half the world’s internet users have Facebook accounts. It’s not just Facebook and Twitter, though. Social news sites like Reddit, Digg and StumbleUpon draw large crowds too.
But innovation in this sector is reaching a plateau. All social networking websites have essentially the same features: profiles, a news feed, data sharing (photos, links, documents, etc.) and many ways of bringing users together in groups or through direct communication. We’ve reached a point where people are spending less time on social networks than before.
Virtually face to face
The next innovation needed to increase our brain-to-brain bandwidth is being touted to come from wearable computing, be it smartwatches or products like Google Glass. But these seem like incremental developments rather than paradigm-shifting ones.
What we really need is a virtual way to replicate the water-cooler effect, named after the tendency of colleagues in an office to meet at the water cooler, which leads to serendipitous exchanges of ideas. The internet is thought to have reduced the chances of such encounters, and thus slowed the pace of innovation.
It was this that formed the core of a recent note from Marissa Mayer, Yahoo’s CEO, asking Yahoo employees to stop working from home. Many decried Mayer’s note, calling her out of touch with reality. But she has a point: there is a lot of value in face-to-face communication, and no innovation has yet come close to replicating it.
A solution to this problem would truly impact the world. Economists have found that the easiest way to double world GDP would be to get rid of international borders, which is, of course, a politically implausible proposition. But if technological development could make a person’s virtual presence nearly as good as their real presence, this dividend would not remain unrealised.
And perhaps Yahoo workers could start working from home again.
* Many will dispute which exactly was the first personal computer. Perhaps it was the GENIAC, built in 1955. The Apple II, built in 1977, was the first mass-produced PC. But the first PC with the graphical user interface we have become so accustomed to was the Lisa, built in 1983.
^ The often-quoted period of 18 months was a modification by David House, of Intel, who said that the growth in computing power would come not just from more transistors but also from faster ones.