Abstracting Away Complexity: The Importance Of Software And Big Data In IoT
I have recently been reading Thomas Friedman’s book, Thank You For Being Late.
The whole idea of the book is to reframe the reader’s thinking on how both to understand and to take advantage of today’s “era of acceleration” in five crucial areas: work environments, politics, geopolitics, ethics, and everyday communities.
Friedman’s general thesis got me thinking about all the different ways technology has accelerated over the past two decades — specifically in regard to its impact on making everyday objects “smarter.”
Through the rapid evolution of miniature sensors alone — embedded in everything from fire hydrants to cars, trash cans to the collar your dog wears, your toothbrush to your, well, self — software and machine learning have made it possible to understand how these objects interact with the world.
In turn, this gives us (by “us” I mean the vast network of robots and machines existing comfortably in the ether in a fit-to-be-a-Black-Mirror-episode sort of way) the information we need to make decisions based on predictions for the future, instead of reflections of the past.
Let me give you an example:
In Thank You For Being Late, Friedman writes:
“General Electric itself gathers data from more than 150,000 GE medical devices, 36,000 GE jet engines, 21,500 GE locomotives, 23,000 GE wind turbines, 3,900 gas turbines, and 20,700 pieces of oil and gas equipment, all of which wirelessly report to GE how they are feeling, every minute.”
That’s a lot of data. Like, a lot.
Friedman goes on to note that when all these sensors transmit their data to centralized data banks, incredibly powerful software applications begin to look for patterns. From the data, weak signals can be spotted long before they become strong ones, and problems can be addressed proactively instead of reactively.
Here are some easy ways of thinking about it:
PAST: Condition-based maintenance = “If it looks dirty, wash it.”
PAST: Preventative maintenance = “Change the oil in the car every 6,000 miles.”
FUTURE: Predictive/prescriptive maintenance = Look for patterns in the data and predict the exact moment when a component — whether it be a tire, a pharmaceutical drug in the supply chain, or an airplane part — will fail or need to be replaced.
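As a rough illustration of what that FUTURE column means in practice, here is a minimal Python sketch of predictive maintenance: fit a trend to recent sensor readings and estimate when they will cross a failure threshold. The sensor, the threshold, and the numbers are illustrative assumptions, not data from GE or any real system.

```python
# Minimal sketch of predictive maintenance: fit a trend to recent sensor
# readings and estimate when the signal will cross a known failure threshold.
# The sensor, threshold, and readings below are illustrative assumptions.
import numpy as np

def estimate_hours_to_failure(readings, failure_threshold, hours_between_samples=1.0):
    """Extrapolate a linear trend through the readings and return the estimated
    hours until the failure threshold is crossed (None if no degradation)."""
    t = np.arange(len(readings)) * hours_between_samples
    slope, intercept = np.polyfit(t, readings, 1)   # simple linear trend
    if slope <= 0:
        return None                                 # flat or improving signal
    hours_at_failure = (failure_threshold - intercept) / slope
    return max(hours_at_failure - t[-1], 0.0)

# Example: bearing vibration (mm/s) creeping upward; failure assumed at 7.0 mm/s
vibration = [2.1, 2.2, 2.4, 2.3, 2.6, 2.8, 3.1, 3.3, 3.6, 4.0]
eta = estimate_hours_to_failure(vibration, failure_threshold=7.0)
if eta is not None:
    print(f"Schedule maintenance: projected failure in ~{eta:.0f} hours")
```

A real system would use far richer models across many signals at once, but the shift is the same: from reacting to a failure to scheduling the work before it happens.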
Now, what do some of those FUTURE, predictive examples look like?
1. Smart Fire Hydrants
There are fire hydrants that continuously track water pressure, sending data back to local water utilities to reduce the number of blowouts that occur.
2. Smart Garbage Cans
These garbage cans alert sanitation workers when they are full, increasing the efficiency of garbage routes (see the sketch after these examples).
3. German Dairy Farming
Microsoft helped create an IoT-connected dairy farm, using sensor tags to monitor the health of the cows — without requiring employees to continuously monitor the barns.
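To make the garbage-can example a little more concrete, here is a toy sketch of the dispatch logic such a system might use; the bin IDs, fill levels, and 80% threshold are made-up illustrative values, not any vendor's actual product.

```python
# Toy sketch of the "smart garbage can" idea: only bins past a fill threshold
# get added to the day's pickup route. All values below are illustrative.
FILL_THRESHOLD = 0.8  # dispatch when a bin reports >= 80% full

bin_readings = {       # latest fill-level reading per bin (0.0 to 1.0)
    "bin-14": 0.92,
    "bin-15": 0.35,
    "bin-16": 0.81,
    "bin-17": 0.10,
}

pickup_route = sorted(
    bin_id for bin_id, level in bin_readings.items() if level >= FILL_THRESHOLD
)
print("Bins needing pickup today:", pickup_route)
```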
These are just a few examples, but they showcase the different ways in which data is becoming increasingly ingrained into our lives.
Which brings me to where I believe all of this is headed:
When you look at these “smart innovations” and combine them with the blockchain, the old model of condition-based maintenance and “dumb alerts” will soon be out the window. It’s just not effective anymore to say, “This happened before, so it will probably happen again.” We’re past that point. Instead, software will crawl mass amounts of data and begin speculating on its own when a part or a tool needs to be cleaned, fixed, replaced, etc. There will be a shift to “preventative maintenance,” and it’s going to affect dozens of industries. Pretty nifty, eh?
A perfect case study, or lens through which to view this phenomenon, is the pharmaceutical industry and the challenges it faces with its supply chain. The way the pharmaceutical cold chain currently runs, sensors are placed on all pharmaceutical shipments to show whether there has been a temperature excursion in transit (a rather analog and binary approach: yes or no, red light or green light), which is part of required compliance practices. Slightly smarter sensors might tell a customer that there has been a temperature excursion, but not necessarily what to do with that information.
That’s a problem.
Not only a problem, but rather archaic if you ask me.
Furthermore, logistics providers, shippers, and packagers are the ones reading the data off these sensors. They don’t know what to do when they see a massive shipment of pharmaceutical drugs has been compromised. Do they send it back? Flag it and keep going? What about the paperwork that needs to be filled out? Who does that get sent to?
It’s an incredibly difficult process for all parties involved.
By combining data from IoT devices and sensors with smarter software, these alerts can be drastically improved. This is something we are building at the company I co-founded, Chronicled: a smart alert platform that can operate in a more predictive and prescriptive way.
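As a sketch of what “prescriptive” could mean here, consider a rule that looks at cumulative exposure outside the allowed temperature band and maps it to a recommended action, instead of a single red-or-green flag. The band, thresholds, and actions below are illustrative assumptions only; they are not Chronicled’s actual rules or any regulatory guidance.

```python
# Sketch of a "prescriptive" cold-chain alert, as opposed to a binary
# red-light/green-light excursion flag: accumulate time outside the allowed
# band and map the exposure to a recommended action. All values are assumed.
ALLOWED_RANGE_C = (2.0, 8.0)   # typical cold-chain band, assumed here
SAMPLE_MINUTES = 15            # minutes between sensor readings

def recommend_action(temps_c):
    """Return (excursion_minutes, recommended action) for a shipment's trace."""
    lo, hi = ALLOWED_RANGE_C
    excursion_minutes = sum(SAMPLE_MINUTES for t in temps_c if t < lo or t > hi)
    if excursion_minutes == 0:
        return excursion_minutes, "release shipment"
    if excursion_minutes <= 60:
        return excursion_minutes, "release, but log excursion for quality review"
    return excursion_minutes, "quarantine shipment and notify the manufacturer"

trace = [4.5, 5.0, 8.9, 9.2, 9.0, 7.5, 6.1]   # simulated readings in transit
minutes, action = recommend_action(trace)
print(f"{minutes} min outside range -> {action}")
```

The point is that the alert carries a next step for the logistics provider, not just a compromised-or-not flag.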
The last step, though, is to collect all this data in a decentralized manner (on a blockchain), so that the entire industry can work together to increase efficiency, reduce error, and plan for excursions long before they occur. The blockchain plays such a vital role here because big data that is centralized and owned by a handful of big companies creates a monopoly of power.
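To see why a shared, tamper-evident record helps, consider a toy hash-chained log: each record commits to the one before it, so any party can verify that no one quietly rewrote an earlier excursion event. This is only a teaching sketch of the idea, not a real blockchain network or our implementation.

```python
# Toy illustration of the decentralization idea: a hash-chained, append-only
# log of sensor events that any party can verify. Teaching sketch only.
import hashlib, json, time

def add_record(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev_hash": prev_hash, "ts": time.time()}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    chain.append(body)

def verify(chain):
    """Re-derive every hash; tampering with an earlier record breaks the chain."""
    for i, record in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in record.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["prev_hash"] != expected_prev or record["hash"] != recomputed:
            return False
    return True

ledger = []
add_record(ledger, {"shipment": "PHX-001", "event": "temperature excursion", "minutes": 45})
print("Ledger intact:", verify(ledger))
```

Because every participant can hold and verify a full copy of the log, no single company ends up owning the master record of what happened in the supply chain.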
And honestly, who wants to live in a monopolistic, centralized, Black-Mirror-worthy future?
This is why I’m intrigued by blockchains and decentralized data stores for IoT. The promise of interoperability without centralized monopolistic control is revolutionary, and will lead to countless innovative products and services built on top of blockchain-based data stores.