Living Data Streams for Humans and Machines

Reid Williams
IDEO CoLab Ventures
Jan 17, 2017

The rise of IoT devices and big data demands new network infrastructure: one that allows data to flow seamlessly between publishers and subscribers, whether human or machine.

In 1991, Tim Berners-Lee created and released HTML to the world. This moment marked the beginning of a new epoch of the internet (which at the time had already existed for some 20 years). With HTML, we got the world wide web: human-friendly hypertext documents that linked to each other, forming a web of human thought, scientific data, and conversation that quickly wrapped around the globe.

The early web was a frontier, with few fences and no superhighways, where one could roam into the country, set up home, and connect with neighbors. As the population of the frontier grew, joined by search engines, retailers, web apps, social media, video, and web 2.0, the protocol held things together and allowed the boundaries of the internet landscape to grow. There was room for everyone, even if the neon lights of the newcomers sometimes drowned out the simple homes of the first settlers.

Broadly shared standards like HTML are the sparks that open new territories, ultimately giving rise to the culture, inhabitants, and businesses that form there. At the heart of this catalytic power is a simple technical contract: follow the rules of the protocol, but otherwise, do whatever you want.

The Decentralized Web: New Frontiers Require New Infrastructure

Today we stand at the edge of a new frontier, with open territory that offers a place to build new kinds of homes and connect with new neighbors. This is the start of the decentralized web. Bitcoin lives here, as do Ethereum, IPFS (the InterPlanetary File System), Blockstack, and a host of other projects.

As this space unfolds, Moore’s Law is allowing sophisticated computing to be built into cheaper and cheaper devices. Soon we’ll find that inexpensive, networked sensors lead to a flood of new data. As the amount of data rises and increasingly reflects the physical world we live in, we believe there’s a huge opportunity in systems that by default make it easy for data to flow between organizations and individuals in real time.

This is a world of living data streams, unhindered by one-off GET requests or landlocked APIs, where real-time data and information flow freely between publishers and subscribers, and where anyone, anywhere is free to access and remix data as they please. This, we believe, is how we harness the power of big data.

We’re building protocols that create a global but decentralized distribution network for data, where data is published in real time to any interested subscriber, and where nodes in the network can both subscribe and publish value-added analysis. In short: pub-sub for the decentralized web.
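To make the pattern concrete, here is a minimal, self-contained sketch in TypeScript. The StreamNetwork class, the stream names, and the in-memory delivery are hypothetical stand-ins, not the Nomad protocol itself; the sketch only illustrates the shape of the idea: one node publishes raw readings, another subscribes and republishes value-added analysis as a new stream that anyone else can subscribe to in turn.

```typescript
// Hypothetical, in-memory stand-in for the decentralized distribution
// network described above. Stream names and types are illustrative only.

type Message = { stream: string; payload: string; timestamp: number };
type Handler = (msg: Message) => void;

class StreamNetwork {
  private subscribers = new Map<string, Handler[]>();

  // Deliver a payload to every current subscriber of a stream.
  publish(stream: string, payload: string): void {
    const msg: Message = { stream, payload, timestamp: Date.now() };
    for (const handler of this.subscribers.get(stream) ?? []) handler(msg);
  }

  // Register a handler for a stream.
  subscribe(stream: string, handler: Handler): void {
    const handlers = this.subscribers.get(stream) ?? [];
    handlers.push(handler);
    this.subscribers.set(stream, handlers);
  }
}

const network = new StreamNetwork();

// A "value-added" node: it subscribes to raw temperature readings and
// republishes a running average as a new, derived stream.
const readings: number[] = [];
network.subscribe('sensors/temperature/raw', (msg) => {
  readings.push(Number(msg.payload));
  const avg = readings.reduce((a, b) => a + b, 0) / readings.length;
  network.publish('sensors/temperature/running-average', avg.toFixed(2));
});

// A downstream subscriber consumes the derived stream, not the raw one.
network.subscribe('sensors/temperature/running-average', (msg) => {
  console.log(`running average: ${msg.payload}`);
});

// A publisher node (for example, a sensor) emits raw readings.
network.publish('sensors/temperature/raw', '21.3');
network.publish('sensors/temperature/raw', '22.1');
```

The point of the pattern is that derived streams are first-class: an analysis node is just another publisher, so others can build on its output exactly the way they build on raw sensor data.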

So imagine contributing to climate-change data sets by plugging in a Raspberry Pi at home; imagine detecting the spread of waterborne disease through a network of water-quality sensors; imagine building a decentralized Twitter that really is a public utility.

The project is called Nomad; it’s just the beginning, and we’d love your help. Learn more at getnomad.io and on GitHub.
