Edge Computing

Sushmey
Developer Students Club, VJTI
5 min read · Jan 28, 2022

Imagine that you live in a world full of data. It’s 9 AM on a Monday, and you ask your Google Home, “What’s the traffic on the way to work today?” It takes a second and replies, “Based on current traffic, the drive from home to work will take 45 minutes.” You’re late! You pack your breakfast to eat in your autonomous car. The car calculates a faster route, and you arrive at the office in a mere 30 minutes. How was the car able to process so much data, so quickly and all in real time, when your Google Home took a whole second just to estimate a drive time?

Before that, let’s talk about what happens to the questions, or queries, that you ask your smart devices. They travel to a secure cloud server, which processes the question and sends back an appropriate answer. But in time-sensitive situations like driving an autonomous car, where you need an immediate response, how do you reduce the time it takes to aggregate and process the data?

If only there were a way to always be close to the cloud, so that you could process your information right then and there! What if we brought the cloud to our device? Not exactly the cloud, but something similarly powerful: computing data at the edge.

This article will serve as an introduction to edge computing and is just the tip of the iceberg.

What is edge computing?

IBM defines edge computing as:

“a distributed computing framework that brings enterprise applications closer to data sources such as IoT devices or local edge servers.”

Edge

Edge devices are physical hardware located at the edge of the network, often in remote locations, with enough memory, processing power, and computing resources to collect data, process it, and act on it in near real time, with limited help from other parts of the network.

Edge + Computing

So basically what IBM is saying is that edge computing is computing that takes place at or near the physical location of either the user or the source of the data. Way to complicate things, IBM.

What is edge computing? Picture courtesy: IEEE

Why use edge computing?

You could argue “Why do you have to process information where it’s generated? Isn’t it cheaper to send it to the cloud where you can use shared IT resources over a network?”

Yes.

In a cloud computing model, centralised compute resources are accessed over a network by end users at the edge. This model has proven cost advantages and more efficient resource sharing: you don’t have to pay an exorbitant amount or maintain your own server farm just to get a faster CPU.

However, most models charge based on how much you use or how many requests you make, so new kinds of end-user experiences like IoT, which generate huge amounts of data, can rack up huge costs or, worse, leave you with dial-up loading speeds.

It also creates bandwidth bottlenecks and latency delays, since everyone is sending terabytes of raw data to the cloud.

For example,

Edge computing architecture for autonomous vehicles. Picture courtesy: https://bit.ly/3r4toc7

Consider your autonomous vehicle from earlier: it will depend on intelligent traffic control signals. Cars and traffic controls will need to produce, analyse, and exchange data in real time. Multiply this requirement by huge numbers of autonomous vehicles and the need becomes obvious. It demands a fast, responsive network, which can be delivered by processing and aggregating data at the point of generation.

Edge computing acts as a local source of processing and storage for the data and computing needs of IoT devices, which reduces the latency of communication between those devices and the central IT networks they are connected to. Plus, it lets you benefit from the large amount of data created by connected IoT devices: deploying analytics algorithms and machine learning models to the edge means data can be processed locally and used for rapid decision-making.
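To make that concrete, here is a minimal sketch in Python of what “process locally, send only summaries” can look like. Everything in it is an assumption for illustration: the send_to_cloud() uplink, the threshold, and the batch size are hypothetical, not a real IoT API.

```python
from statistics import mean

THRESHOLD = 30.0   # hypothetical alert threshold (e.g. degrees Celsius)
BATCH_SIZE = 60    # summarise one minute of one-per-second readings

def send_to_cloud(payload: dict) -> None:
    # Stand-in for a real uplink (MQTT, HTTPS, ...); here we just print.
    print("uplink:", payload)

def process_batch(readings: list) -> None:
    # All raw readings are handled here on the device; the cloud only
    # ever sees this small aggregate, saving bandwidth and round trips.
    send_to_cloud({
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alert": max(readings) > THRESHOLD,  # the decision is made locally
    })

buffer = []
for reading in [28.4, 29.1, 31.7, 30.2] * 15:  # stand-in sensor stream
    buffer.append(reading)
    if len(buffer) == BATCH_SIZE:
        process_batch(buffer)
        buffer = []
```

The design choice is the whole point: raw readings never leave the device, and the cloud receives a payload of a few dozen bytes instead of a continuous raw stream.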

But now you’d wonder “What about security? What if someone hacks into one of the edge devices?”

Let’s say you are a hacker: you hack into one of the edge devices and can now see the data being processed on it. That data would be very niche and specific, since it’s only a fraction of the whole. So even if someone does manage to get through the security, the data wouldn’t be of much use.

On the other hand, endpoint data stored on data centre servers tends to be combined with other data points, creating a more complete collection of information that hackers could use for nefarious purposes.

For example, consider edge computing in a healthcare setting. Sensors collect a patient’s vital signs, which are then analysed by an edge computing device. That device only holds those readings.

However, if the endpoint sensors send the data back to servers where it’s stored with other information, including personally identifiable information about the patient, and that information is hacked, then that patient’s privacy is compromised.
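As a hedged sketch of that data-minimisation idea, the snippet below keeps raw vitals in a short window on the device and forwards only a de-identified alert. The field names, the threshold, and forward_alert() are made up for the example; they are not a real medical protocol.

```python
from collections import deque

recent_vitals = deque(maxlen=120)  # raw readings stay on the device

def forward_alert(event: dict) -> None:
    print("alert for on-call staff:", event)  # stand-in for the uplink

def on_reading(heart_rate: int) -> None:
    recent_vitals.append(heart_rate)
    if heart_rate > 120:  # illustrative threshold, not medical advice
        # No name, no patient ID in the message: a breach of this alert
        # alone exposes far less than a combined central record would.
        forward_alert({"metric": "heart_rate", "value": heart_rate})

for hr in (72, 80, 95, 132, 88):  # stand-in sensor readings
    on_reading(hr)
```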

Where is edge computing used?

Principally, edge computing techniques are used to collect, filter, process, and analyse data “in place”, at or near the network edge. It’s a powerful means of using data that can’t first be moved to a centralised location, usually because the sheer volume of data makes such moves cost-prohibitive or technologically impractical.

Edge computing is also reliable and resilient in situations where communication channels are slow or unavailable.

For example,

Satellite imagery, like the kind used on the International Space Station (ISS), is an edge computing implementation. Edge devices physically located on the ISS run containerised analytics nodes that connect to the cloud on Earth. Only images that are worth transferring are sent down to the ground, because the sheer volume of data collected is too large to ship to an Earth-based cloud.
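A minimal sketch of that “send only what’s worth transferring” pattern, assuming a hypothetical usable_fraction() quality score in place of the real containerised analysis:

```python
def usable_fraction(image: bytes) -> float:
    # Stand-in for the real analysis running on the edge node;
    # pretend it scores how much of the frame is usable.
    return (len(image) % 10) / 10.0

def worth_downlinking(image: bytes, min_score: float = 0.5) -> bool:
    return usable_fraction(image) >= min_score

captured = [bytes(i) for i in range(1, 9)]  # stand-in image frames
to_send = [img for img in captured if worth_downlinking(img)]
print(f"downlinking {len(to_send)} of {len(captured)} frames")
```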

Even in our hyperconnected world, not every place has a way of reaching the omnipresent cloud. For example, an energy company with edge computing deployments on an oil rig doesn’t have to rely on a constantly available satellite connection to relay data back to a data centre for processing; it can instead move only the necessary processed information from the edge back to its data centre whenever the connection is available.
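The same idea in code: a store-and-forward sketch where processed summaries queue up locally and are flushed opportunistically. The link_is_up() check and the payloads are stand-ins, not a real satellite API.

```python
import random
from collections import deque

outbox = deque()  # processed summaries queued on the rig

def link_is_up() -> bool:
    return random.random() < 0.3  # pretend the link is up ~30% of the time

def record_result(summary: dict) -> None:
    outbox.append(summary)  # always store locally first
    while outbox and link_is_up():
        print("relayed to data centre:", outbox.popleft())

for hour in range(6):  # stand-in hourly processed results
    record_result({"hour": hour, "avg_pressure_kpa": 101 + hour})

print(len(outbox), "summaries still queued for the next window")
```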

Looking forward

The growing number of IoT devices and the advent of 5G will generate an unprecedented amount of data.

Gartner, a technology research and consulting company, estimates that by 2025, 75% of enterprise-generated data will be created and processed outside the traditional data centre or cloud.

Edge computing tackles the growing demand for lower latency, for processing the ever-growing amount of data at the edge, and for resilience to network disconnection.

Hi! I am Sushmey, a pre-final year Information Technology student studying at VJTI, Mumbai.