Towards an Internet of Living Things

Conservation groups are using technology to understand and protect our planet in an entirely new way.

Photo by Ari Friedlaender from an expedition using a multi-sensor approach to measuring the foraging behavior of Minke Whales in Antarctica.

The Internet of Things (IoT) was an idea that industry always loved. It was simple enough to predict: as computing and sensors became smaller and cheaper, they would be embedded into devices and products that interact with each other and their owners. Fast forward to 2017 and the IoT is in full bloom. Because of the stakes — that every device and machine in your life will be upgraded and harvested for data — companies wasted no time getting in on the action. There are smart thermostats, refrigerators, TVs, cars, and everything else you can imagine.

Industry was first, but it isn't the only player. Now conservationists are taking the lead.

The same chips, sensors (especially cameras), and networks being used to wire up our homes and factories are being deployed by scientists (both professional and amateur) to understand our natural world. It's an Internet of Living Things. It isn't just a future of efficiency and convenience. It's enabling us to ask different questions and understand our world from an entirely new perspective. And just in time. As environmental challenges — everything from coral bleaching to African elephant poaching — continue to mount, this emerging network will serve as a planetary nervous system, giving insight into precisely what actions to take.

It’s a new era of conservation based on real-time data and monitoring. It changes our ecological relationship with the planet by changing the scales at which we can measure — we get increased granularity as well as a truly macro view of the entire planet. It also allows us to simultaneously (and without bias) measure the most important part of the equation: ourselves.

Specific and Real-Time

We have had population estimates of species for decades, but things are different now. Before, the estimates came from academic fieldwork; now we’re beginning to rely on vast networks of sensors to monitor and model those same populations in real time. Take the recent example of Paul Allen’s Domain Awareness System (DAS), which covers broad swaths of West Africa. Here’s an excerpt from the Bloomberg feature:

For years, local rangers have protected wildlife with boots on the ground and sheer determination. Armed guards spend days and nights surrounding elephant herds and horned rhinos, while on the lookout for rogue trespassers.
Allen’s DAS uses technology to go the distance that humans cannot. It relies on three funnels of information: ranger radios, animal tracker tags, and a variety of environmental sensors such as camera traps and satellites. This being the product of the world’s 10th-richest software developer, it sends everything back to a centralized computer system, which projects specific threats onto a map of the monitored region, displayed on large screens in a closed circuit-like security room.
For instance, if a poacher were to break through a geofence sensor set up by a ranger in a highly-trafficked corridor, an icon of a rifle would flag the threat as well as any micro-chipped elephants and radio-carrying rangers in the vicinity.

These networks are being woven together in ecosystems all over the planet. Old cellphones are being turned into rainforest monitoring devices. Drones are surveying and assessing the health of Koala populations in Australia. The conservation website Mongabay now has a section of its site dedicated to the fast-moving field, which it has dubbed WildTech. Professionals and amateurs are gathering in person at events like Make for the Planet and in online communities. It’s game on.

The trend is building momentum because the early results have been so good, especially in terms of resolution. The organization WildMe is using a combination of citizen science (essentially human-powered environmental sensors) and artificial intelligence to identify and monitor individuals in wild populations. As in: meet Struddle the manta ray, number 1264_B201. He’s been sighted ten times over the course of ten years, mostly around the Maldives.

A different manta ray.
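How might a system recognize Struddle in a new photo? A minimal sketch of the idea, assuming each individual is represented by a feature vector (a spot-pattern embedding, say) and new sightings are matched to the nearest catalog entry. The names, vectors, and threshold below are invented for illustration; WildMe’s actual computer-vision pipeline is far more sophisticated.

```python
# Hedged sketch: identify an individual animal by comparing a new
# sighting's feature vector against a catalog of known individuals.
# Catalog entries and embeddings here are hypothetical.
import math

catalog = {
    "Struddle_1264_B201": [0.9, 0.1, 0.4],
    "ray_0042":           [0.2, 0.8, 0.5],
}

def identify(embedding, threshold=0.5):
    """Return the catalog individual closest to the embedding,
    or None if nothing is within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, vec in catalog.items():
        dist = math.dist(embedding, vec)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

print(identify([0.85, 0.15, 0.45]))  # a vector close to Struddle's
```

Each confirmed match becomes a new sighting record, which is how a ten-year history like Struddle’s accumulates.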

The combination of precision and pervasiveness means these are more than just passive data-collecting systems. They’re beyond academic; they’re actionable. We can estimate more accurately — an estimated 352,271 elephants remain in Africa — but we can also react when something happens — a poacher broke a geofence ten minutes ago.
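The geofence scenario can be sketched in a few lines. This is a toy illustration, not how DAS is implemented — the corridor coordinates, tracker record, and alert string are all invented:

```python
# Toy sketch of a geofence breach check: test whether a tracker fix
# falls inside a protected corridor polygon, using ray casting.

def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is (lon, lat) inside the polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a ray cast east from the point cross this edge?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical protected corridor, as (lon, lat) pairs.
corridor = [(36.0, -1.0), (36.5, -1.0), (36.5, -1.5), (36.0, -1.5)]

def check_breach(tracker_fix):
    """Flag a fix that falls inside the geofenced corridor."""
    if point_in_polygon(tracker_fix["lon"], tracker_fix["lat"], corridor):
        return f"ALERT: {tracker_fix['id']} inside corridor"
    return None

print(check_breach({"id": "unknown-signal-7", "lon": 36.2, "lat": -1.2}))
```

A real system would run this continuously over streams of radio, tag, and sensor fixes, then place the alert icon on the operations-room map.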

The Big Picture

It’s not just finer detail, either. We’re also getting a bigger picture than we’ve ever had before. We’re watching on a planetary scale.

Of course, advances in satellites are helping. Planet (the company) has been a major driving force. Over the past few years it has launched hundreds of small imaging satellites, creating an Earth-imaging constellation with the ambition of capturing an image of every location on Earth, every day. Like Google Earth, but near-real-time and searchable along the time horizon. As an example of this in action, Planet caught an illegal gold mining operation in the act in the Peruvian Amazon rainforest.

It’s not just satellites; it’s connectivity more broadly. Traditionally analog wildlife monitoring is going online. Ornithology gives us a good example. For the past century, the study of birds has relied on amateur networks of enthusiasts — the birders — to contribute data on migration and occurrence. (For research that spans long time spans or broad geographic areas, citizen science is often the most effective method.) Now, thanks to the ubiquity of mobile phones, birding is being digitized and centralized on platforms like eBird and iNaturalist. You can watch the real-time submissions and observations:
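What centralizing those submissions enables, in miniature: raw checklist records stream in from birders, and the platform aggregates them into per-species, per-region occurrence counts that feed the maps below. The field names and records here are invented; the real eBird data model is much richer.

```python
# Toy sketch: aggregate raw citizen-science submissions into
# per-species, per-region occurrence counts. Records are hypothetical.
from collections import Counter

submissions = [
    {"species": "Red-headed Woodpecker", "region": "US-OH"},
    {"species": "Red-headed Woodpecker", "region": "US-OH"},
    {"species": "Barn Swallow", "region": "US-OH"},
    {"species": "Red-headed Woodpecker", "region": "US-TX"},
]

def occurrence_counts(records):
    """Count observations per (species, region) pair."""
    return Counter((r["species"], r["region"]) for r in records)

counts = occurrence_counts(submissions)
print(counts[("Red-headed Woodpecker", "US-OH")])  # → 2
```

Binned by week instead of region, the same aggregation yields the animated migration maps described next.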

Sped up, we get the visual of species-specific migrations over the course of a year:

Occurrence map of the Red-headed Woodpecker. Source: eBird

Human Activity

The network we’re building isn’t all glass, plastic, and silicon. It’s people, too. In the case of the birders above, the human component is critical. They’re doing the legwork, getting into the field and pointing the cameras. They’re both the brawn and the (collective) brain of the operation.

There are benefits to keeping humans in the loop. It allows these networks to scale faster: birders with smartphones and eBird can happen now, whereas a network of passive forest listening devices would take years to build (and would be much more expensive to maintain). It also makes these systems more adept at managing ethical and privacy concerns — people are involved in the decision making at all times. But the biggest benefit of keeping people in the loop is that we can watch them — the humans — too. Because as much as we’re learning about species and ecosystems, we also need to understand how we ourselves are affected by engaging with and perceiving the natural world.

We’re getting more precise measurements of species and ecosystems (a better small picture), as well as a better idea of how they’re all linked together (a better big picture). But we’re also getting an accurate sense of ourselves and our impact on and within these systems (a better whole picture).

We’re still at the beginning of measuring the human-nature boundary, but the early results suggest it will help the conservation agenda. A sub-genre of neuroscience called neurobiophilia has emerged to study the effects of nature on our brain function. (Hint: it’s great for your health and well-being.) National Geographic is sending some of its explorers into the field wired up with Fitbits and EEG machines. The emerging academic field of citizen science seems to be as concerned with the effects of participation as it is with outcomes. So far, the science indicates that engagement in the data-collecting process has measurable effects on a community’s ability to manage different issues. The lesson here: not only is nature good for us, but we can evolve towards a healthier perspective. In a world approaching 9 billion people, this collective self-awareness will be critical.

What’s next

Just as fast as we’re building this network, we’re learning what it’s actually capable of doing. Even as we lay the foundation, it’s starting to come alive. The next chapter is applying machine learning to help make sense of the mountains of data these systems are producing. Want to quickly survey the dispersion of Arctic ponds? Here. Want to count and classify the fish you’re seeing with your underwater drone? We’re building that. In a broad sense, we’re “closing the loop,” as Chris Anderson explained in an interview:

If we could measure the world, how would we manage it differently? This is a question we’ve been asking ourselves in the digital realm since the birth of the Internet. Our digital lives — clicks, histories, and cookies — can now be measured beautifully. The feedback loop is complete; it’s called closing the loop. As you know, we can only manage what we can measure. We’re now measuring on-screen activity beautifully, but most of the world is not on screens.
As we get better and better at measuring the world — wearables, Internet of Things, cars, satellites, drones, sensors — we are going to be able to close the loop in industry, agriculture, and the environment. We’re going to start to find out what the consequences of our actions are and, presumably, we’ll take smarter actions as a result. This journey with the Internet that we started more than twenty years ago is now extending to the physical world. Every industry is going to have to ask the same questions: What do we want to measure? What do we do with that data? How can we manage things differently once we have that data? This notion of closing the loop everywhere is perhaps the biggest endeavor of our age.

Conservation has long been concerned with protecting our natural resources. Finally, we have the tools to understand what that truly means. And this may just be the warm-up act. New technologies are allowing us to move beyond bits and into biology. Tools like eDNA and handheld DNA/RNA analyzers mean we can add another level of resolution to our monitoring. Technologies like CRISPR and gene drives mean we’ll be able to respond to and affect our environments in even more dramatic fashion. This isn’t science fiction. It’s already happening, which puts even more urgency on building and understanding this new planetary nervous system.