Edge computing

Published in ISA-VIT · Mar 24, 2020

By Aarunya Paliwal

The internet is no longer a web that we connect to. Instead, it’s a computerized, networked, and interconnected world that we live in. This is the future.

- Bruce Schneier, security technologist and author

In recent times, the theory and practice of cloud computing have gained momentum; computer systems, remote devices, and appliances now ship with this technology built in. Edge computing, by contrast, is the practice of processing data near the edge of the network, where it is generated, instead of in a centralised data-processing warehouse.

There has been speculation about the usefulness and user-friendliness of edge computing and its features, but recent developments in mobile computing and Internet of Things (IoT) technologies have shown the world both its massive scope and its effectiveness.

Edge computing refers to processing Internet of Things data closer to where it is created: analytics and knowledge generation are performed at or near the source of the data, at the edge of the network.

Edge computing works by pushing data, applications, and computing power away from the centralized network to its extremes, so that fragments of information lie scattered across a distributed network of servers. Once available only to large-scale organizations, it is now within reach of small and medium organizations thanks to cost reductions in large-scale implementations.

For instance, consider a building fortified with high-definition IoT closed-circuit television cameras. These cameras simply output a raw video signal and continuously stream it to a cloud server. The cloud server feeds all the video outputs into a motion-detection application and stores only those clips in which some sort of activity is observed. Streaming such high-definition signals uses significant bandwidth and also places a sizeable load on the cloud server, which has to process the video footage from all the cameras simultaneously.

Now put edge computing and its ancillary technologies into the same scenario: each camera gets its own small internal computer that runs motion detection locally and sends only the clips in which activity is recognised. This cuts network bandwidth usage many times over. It also relieves the cloud server of the heavy load of processing video, so it can conveniently store relevant clips from a much larger number of cameras. Bringing the bulk of the processing to the source of the input is what edge computing is.
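To make the idea concrete, here is a minimal sketch of what the edge side of that setup could look like in Python, assuming OpenCV for simple frame differencing and a hypothetical cloud endpoint for uploads; a real camera would use a proper motion-detection model and send short video clips rather than single frames.

```python
# Minimal sketch of edge-side motion detection (assumptions: OpenCV available,
# UPLOAD_URL is a hypothetical cloud endpoint, frame differencing stands in
# for a real motion-detection model).
import cv2
import requests

UPLOAD_URL = "https://cloud.example.com/clips"  # hypothetical cloud endpoint


def has_motion(prev_gray, curr_gray, pixel_threshold=25, changed_pixels=5000):
    """Flag motion when enough pixels differ between consecutive frames."""
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, pixel_threshold, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(mask) > changed_pixels


def run_camera(source=0):
    cap = cv2.VideoCapture(source)          # local camera, processed on-device
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if has_motion(prev_gray, gray):
            # Only frames with activity ever leave the device,
            # instead of a continuous high-definition stream.
            _, jpeg = cv2.imencode(".jpg", frame)
            requests.post(UPLOAD_URL, data=jpeg.tobytes(),
                          headers={"Content-Type": "image/jpeg"})
        prev_gray = gray
    cap.release()
```

The design choice is the same as in the camera example above: the expensive, repetitive work (scanning every frame) stays on the device, and only the rare, interesting result crosses the network.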

Another advantage of moving processing to the edge of the network is reduced latency. Consider the delay incurred when a remote device has to send signals to a far-off cloud server; if the device instead has its own internal processing unit, a great deal of time, and therefore money, can be saved.
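A rough way to see the difference is to time both paths, as in the sketch below; the endpoint is hypothetical and the local work is a stand-in, so the absolute numbers mean little, but the cloud path always carries the network round trip that the local path avoids.

```python
# Rough latency comparison (assumptions: CLOUD_URL is a hypothetical remote
# analysis service; sum(payload) stands in for on-device analytics).
import time
import requests

CLOUD_URL = "https://cloud.example.com/analyze"  # hypothetical remote service


def cloud_round_trip(payload: bytes) -> float:
    start = time.perf_counter()
    requests.post(CLOUD_URL, data=payload, timeout=5)  # network hop both ways
    return time.perf_counter() - start


def local_processing(payload: bytes) -> float:
    start = time.perf_counter()
    _ = sum(payload)  # stand-in for on-device analytics, no network hop
    return time.perf_counter() - start


sample = bytes(1_000_000)  # one megabyte of dummy sensor data
print(f"cloud round trip : {cloud_round_trip(sample):.3f} s")
print(f"local processing : {local_processing(sample):.3f} s")
```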

When you leverage IoT end to end, or in highly sensor-intensive and therefore data-intensive environments, data is by definition generated at the edge, because the sensing and gathering devices sit there. If you rely on traditional centralised approaches, you inevitably run into challenges around bandwidth, network latency, and overall speed. In IoT applications with a mission-critical and/or remote component, the need for speed, and for different approaches such as edge computing, is even greater.

You need the aggregated and analysed data, in the shape of actionable intelligence, so that actions and decisions, whether human or automated, can be taken fast. You don't need to store and analyse all of the raw data in the cloud; you only want that small, meaningful portion travelling across your networks.
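As a sketch of that idea, the snippet below aggregates a window of raw sensor readings on the edge device and ships only a compact summary to a hypothetical cloud ingestion endpoint, so thousands of individual data points never leave the site.

```python
# Edge-side aggregation sketch (assumptions: INGEST_URL is a hypothetical
# endpoint; the 80.0 alert threshold is chosen purely for illustration).
import statistics
import requests

INGEST_URL = "https://cloud.example.com/ingest"  # hypothetical endpoint


def summarize(readings: list[float]) -> dict:
    """Reduce a window of raw readings to the figures the cloud actually needs."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "alert": max(readings) > 80.0,  # threshold for illustration only
    }


def flush_window(readings: list[float]) -> None:
    # One small JSON document replaces the full stream of raw data points.
    requests.post(INGEST_URL, json=summarize(readings), timeout=5)


flush_window([71.2, 69.8, 84.5, 70.1])
```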

You can imagine hundreds of scenarios where speed and fast data are key: asset management, critical power issues, process optimization, predictive analytics, and the real-time needs of supply chain management in a hyper-connected world; the list is endless.

You can also imagine that the more your building, business ecosystem, and so on thrive on fast data and real-time holistic management, the more valuable that data becomes when properly leveraged and rapidly analysed. We live in times where having the right insights fast enough can have enormous consequences.

Speed of data and analysis is essential in many industrial IoT applications, but it is also a key element of industrial transformation and of every other area where we move towards autonomous and semi-autonomous decisions made by systems, actuators, and various controls.

That degree of autonomy is even at the very core of many of the desired outcomes.


Originally published at https://blog.isavit.club.

