The IoT has greater potential to advance society than anything since the Industrial Revolution: a world where all kinds of things are interconnected, smart, communicating, and improving our quality of life.
Our future is a connected, smart world. Rapid advances in hardware are enabling an explosion of connected devices: we are seeing reduced power consumption alongside increased computing power.
Koomey’s Law is what unleashes the Internet of Things. The idea is that, at a fixed computing load, the amount of battery power you need falls by a factor of two every 18 months. This enables wide-scale proliferation of mobile and miniaturized computing, sensing, and telemetry applications, facilitating the Internet of Things, which in turn drives growth of data processing and content management in the data center.
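As a rough illustration of that halving, here is a short sketch. It assumes a clean 18-month doubling of energy efficiency, which is an idealization of Koomey's observed trend:

```python
def battery_energy_needed(initial_joules, years, doubling_period_years=1.5):
    """Energy needed for a fixed computing load under Koomey's Law.

    Idealized model: efficiency doubles every `doubling_period_years`,
    so the energy required for the same workload halves on that schedule.
    """
    return initial_joules * 0.5 ** (years / doubling_period_years)

# After 6 years (four doublings), the same workload needs 1/16 of the energy:
print(battery_energy_needed(100.0, 6))  # 6.25
```

Four halvings in six years is why a sensor that once needed a wall socket can now run for years on a coin cell.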
We can only go as fast as Moore’s Law and Nielsen’s Law allow. Comparing the two laws shows that bandwidth grows more slowly than computing power.
Moore’s Law says that computers double in capability every 18 months, which corresponds to about 60% annual growth, while Nielsen’s Law puts high-end users’ bandwidth growth at about 50% per year. Because bandwidth compounds more slowly, it will remain the gating factor in the experienced quality of using the Internet medium.
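The gap compounds dramatically. A quick sketch, using the conventional figures for the two laws (an 18-month doubling for compute, roughly 50% per year for bandwidth):

```python
# Annual growth factors implied by each law:
moore_annual = 2 ** (12 / 18)   # doubling every 18 months, about 1.587 (~60%/yr)
nielsen_annual = 1.5            # Nielsen's Law: bandwidth grows ~50%/yr

years = 10
compute_growth = moore_annual ** years      # roughly a 100x gain
bandwidth_growth = nielsen_annual ** years  # roughly a 58x gain
print(round(compute_growth), round(bandwidth_growth))
```

After a decade, computing power has pulled far ahead of bandwidth, which is why the network, not the processor, tends to be the bottleneck users actually feel.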
Connected ≠ Smart
The future focus will be on connecting devices, software and people. Leveraging AI + IoT to create intelligent automation agents that can monitor, learn about and control systems.
Just because something is connected doesn’t mean it’s smart. WiFi, for example, is a connector, but connectivity alone isn’t intelligence. It’s what you do with that connectivity: things become aware of the other things around them, talk to them, communicate and interact with them, and then learn from and use that knowledge and data. That is a smart future. But there are many hurdles to clear before we see it. (Where does data get pre-processed before the cloud? Which data gets shared, and with whom?)
Data is what we will build businesses on and provide services from; there will be economic value created from the data these devices generate. The average internet user consumes about 1.5 gigabytes of data per day. That data is worth a lot, and the figure is only rising, especially as internet access, connectivity, and the number of devices on the market keep increasing.
The GSM Association’s Connected Life predicts that by 2020, there will be 24 billion connected devices, while Cisco’s current Internet of Everything prediction is 37 billion “intelligent things,” such as cars, appliances, smartphones, tablets, monitoring sensors and more, connected to the Internet.
“The value of a network is proportional to the square of the number of connected users of the system.” -Robert “Bob” Metcalfe
Metcalfe’s law demonstrates the value of interconnectedness. With 5 devices there are 10 possible connections, but with 100 devices there are 4,950. The number of connections grows quickly, as does the economic value derived from them. Interconnectedness is a software problem, and a hard one to solve in IoT. That’s why we need developers.
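The count behind those numbers is just the number of unique pairs among n devices, n(n-1)/2:

```python
def metcalfe_connections(n):
    """Possible pairwise connections among n devices: n * (n - 1) / 2."""
    return n * (n - 1) // 2

print(metcalfe_connections(5))    # 10
print(metcalfe_connections(100))  # 4950
```

Twenty times the devices yields nearly five hundred times the connections, which is the quadratic growth Metcalfe’s law points at.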
With IoT, we are basically exposing a new platform. Here’s an analogy: before smartphones, regular phones had sensors, speakers, mics, and GPS, yet no one did anything with them but speak. Then we put an operating system on top of the same hardware. Now the hardware functionality is exposed as a set of APIs, and devs can do what they want with it. We exposed a new platform, and IoT is doing the same thing.
Ok, bear with me. We already learned that connected ≠ smart; now let me tell you that connected ≠ interconnected, but interconnectedness = interoperability.
Ok, so what’s the difference between connected things and interconnected things? Interconnecting devices is a powerful concept and creates economic value. The value lies in the number of connections devices create, not in the number of devices themselves. Interconnectedness drives economic value.
In the McKinsey 2015 IoT study, they report that the total economic value to be generated by the IoT in 2025 is $11.1 Trillion. 47% of the value — $5.2 Trillion — will be unlocked by Interoperability.
We overestimate IoT now and in the next few years, while we vastly underestimate where IoT will be in 5–10 years, once some of the hurdles have been addressed. There’s a lot of work to be done, and we need developers. We need developers to help us harmonize on data models, on APIs, and on device classes across industries.
We can’t keep creating more fragmentation, where everyone does their own thing differently. We must also be inventive and think about how we grandfather in what already exists. How do we deal with old devices with constraints? How do we patch them?
The Zettabyte Era
We are nearly in the Zettabyte Era: global IP traffic will cross the zettabyte threshold in 2016. 1 zettabyte (ZB) = 1,000 exabytes (EB), where 1 exabyte = 1 million terabytes (TB) of data.
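A quick sanity check on that unit arithmetic, using decimal (SI) units as Cisco’s traffic reports do (the binary convention, 1,024 per step, gives slightly different figures):

```python
TB = 10 ** 12       # terabyte, in bytes (decimal SI units)
EB = 10 ** 6 * TB   # 1 exabyte = 1 million terabytes
ZB = 10 ** 3 * EB   # 1 zettabyte = 1,000 exabytes

# A zettabyte is a billion terabytes:
print(ZB // TB)  # 1000000000
```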
Every minute of every day:
- 72 hours of video are uploaded to YouTube
- 204 million emails are sent
- Google receives over 4 million search queries
- Facebook users share close to 2.5 million pieces of content
- Over 300,000 photos are shared through WhatsApp
- Tinder users swipe more than 416,667 times
Cisco has published interesting trends and analysis for thresholds to be reached by 2020, and the numbers are hard to wrap your head around.
- Global IP traffic has increased fivefold over the past 5 years and will reach 2.3 ZB per year
- The number of devices connected to IP networks will be more than three times the global population
- There will be 3.4 networked devices per capita by 2020, and smartphone traffic will exceed PC traffic
Get your head out of the clouds!
How did it get its name? It may sound silly, but fog is basically a cloud that is closer to the ground. Fog computing is an extension of cloud computing that involves:
- Adding processing and memory resources to edge devices
- Pre-processing collected data at the edge
- Sending aggregated results to the cloud
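The steps above can be sketched in a few lines. This is a minimal illustration, not any particular fog framework; `send_to_cloud` is a hypothetical stand-in for a real uplink call:

```python
def aggregate_readings(readings):
    """Pre-process a batch of raw sensor readings at the edge,
    reducing them to a compact summary before anything leaves the device."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def send_to_cloud(summary):
    # Hypothetical uplink; a real device would POST this summary to a cloud API.
    print(summary)

raw = [21.0, 21.4, 22.1, 35.0, 21.2]  # e.g. temperature samples
send_to_cloud(aggregate_readings(raw))
```

Instead of shipping every raw sample upstream, the device sends one small summary, which is the bandwidth and latency win fog computing is after.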
Matt Newton, director of technical marketing at Opto 22 (a manufacturer of controllers, I/O, relays and software for linking devices to networks) explained that:
“fog computing involves pushing intelligence and processing capabilities down closer to where the data originates” from pumps, motors, sensors, relays, etc.
Why “Fog”? Fog allows for processing data locally in smart devices, as opposed to sending data to the cloud for processing. It is a model in which data, processing and applications are concentrated in devices at the network edge rather than existing almost entirely in the cloud.
As the number of connected devices worldwide continues to skyrocket, the amount of data being generated grows exponentially. We could keep building server farms to keep up, but that only works for so long. Fog computing eases the problem by letting smart devices be smart: unlike a primitive sensor, a smart device can handle the interpretation logic itself rather than requiring a trip to the cloud.
The Internet of Things (IoT) is generating an unprecedented volume and variety of data. But by the time the data makes its way to the cloud for analysis, the opportunity to act on it might be gone.
What it takes to develop for IoT
The real trick in an Internet of Things product is moving data quickly and efficiently, so at the heart of any IoT implementation is the API. Done correctly, it can simplify your code and speed up development, and if you ultimately deliver data to the user seamlessly, it can actually enhance the user experience. You can think of the API as the product, and the developer as the customer.
Device people and software people don’t understand each other. For the device folks, the API is the product and the app developers are the primary consumers. When building APIs for devices, you need to understand the consumers’ needs in terms of design and their preferred protocols, which mimic present-day web architecture.
REST and JSON APIs let software engineers avoid reinventing the wheel when building new apps. REST is the architecture of the web, and if you’re going to build applications on the web, shouldn’t you work with that architecture rather than against it?
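Concretely, a device reading exposed over REST is just a JSON resource. Here is a hedged sketch of the kind of payload a hypothetical endpoint like `GET /devices/42/temperature` might return; the field names are illustrative, not any particular API’s schema:

```python
import json

# An illustrative device reading, modeled as a JSON-serializable resource.
reading = {
    "device_id": 42,
    "sensor": "temperature",
    "value": 22.5,
    "unit": "celsius",
}

payload = json.dumps(reading)
print(payload)

# Any client, in any language, can parse it back without bespoke device code:
assert json.loads(payload)["value"] == 22.5
```

This is the wheel you do not have to reinvent: standard serialization on one side, a standard parser on the other, with the web’s own architecture carrying it between them.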