Cloud or Edge: Which Best Suits a 5G Network?

--

Cloud computing was the first network infrastructure on the scene that enabled data storage over the internet and made it accessible from any device also connected to the internet. Cloud computing, in turn, enabled edge computing, which builds on the cloud concept but brings data closer to the end user, decreasing latency, improving speed, and creating wholly new opportunities for digital transformation across industries and in our daily lives.

Is an edge part of a cloud?

Kind of.

Edge devices can contribute to a cloud if the storage and computing capabilities provided by those devices at the endpoints of a network are abstracted, pooled, and shared across a network, essentially becoming part of a larger cloud infrastructure.

Edge computing itself, however, is not part of a cloud. What makes edge computing so useful is that it is purposefully separate from clouds and cloud computing.

Here’s how we see it:

  • Clouds are places where data can be stored or applications can run. They are software-defined environments created by datacenters or server farms.
  • Edges are also places where data is collected. They are physical environments made up of hardware outside a datacenter.
  • Cloud computing is an act; the act of running workloads in a cloud.
  • Edge computing is also an act; the act of running workloads on edge devices.

An edge (location) is not the same thing as edge computing (action). Collecting data at the edge of a network and transferring it to a cloud with minimal (if any) modification is not edge computing; it's just networking.

But, if that data is collected and processed at the edge, then it’s edge computing.
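To make the distinction concrete, here is a minimal sketch contrasting plain forwarding with edge processing. The function names, readings, and threshold are all hypothetical and purely illustrative, not any real device API:

```python
# Hypothetical sensor readings gathered at a network edge (illustrative only).
readings = [21.4, 21.5, 35.2, 21.3, 21.6]

def forward_to_cloud(data):
    """Plain networking: ship the raw data upstream unmodified."""
    return {"payload": data}

def process_at_edge(data, threshold=30.0):
    """Edge computing: act on the data locally, send only the result."""
    alerts = [r for r in data if r > threshold]
    return {"count": len(data), "alerts": alerts}

print(forward_to_cloud(readings))   # all five raw readings cross the network
print(process_at_edge(readings))    # only a compact summary leaves the edge
```

The first function just moves bytes; only the second makes a decision at the point of collection, which is the dividing line the text draws.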

Edge computing is kept separate from clouds for two main reasons:

1. Time sensitivity. The rate at which a decision needs to be made doesn’t allow for the lag that would normally take place as data is collected by an edge device, transferred to a central cloud without modification, and then processed before a decision is sent back to the edge device for execution.

2. Data volume. The sheer volume of data collected is too much to send, unaltered, to a cloud.
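A rough illustration of the data-volume point: aggregating readings locally before uplink can shrink what must cross the network by orders of magnitude. The sampling rate and aggregation scheme below are hypothetical numbers chosen for the example:

```python
# Illustrative only: a sensor sampling at 1 kHz for one minute.
samples_per_second = 1000
seconds = 60
raw_points = samples_per_second * seconds   # 60,000 values to uplink raw

# Edge aggregation: keep one (min, max, mean) triple per second instead.
aggregated_points = seconds * 3             # 180 values to uplink

reduction = raw_points / aggregated_points
print(f"uplink shrinks by {reduction:.0f}x")  # prints "uplink shrinks by 333x"
```

Even this naive per-second summary cuts the uplink volume by a factor of several hundred, which is why high-rate sources tend to be processed at the edge.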

What brings edge computing to the fore?

Edge Computing Vs. Cloud Computing — Which One’s Better?

First, it’s important to understand that cloud and edge computing are different, non-interchangeable technologies: one cannot replace the other. Edge computing is used to process time-sensitive data, while cloud computing is used to process data that is not time-sensitive.

Besides latency, edge computing is preferred over cloud computing in remote locations where there is limited or no connectivity to a centralized location. Such sites require local storage, similar to a mini data center, and edge computing provides an ideal solution for it.

Edge computing is also beneficial for specialized, intelligent devices. While these devices are akin to PCs, they are not general-purpose computers designed to perform multiple functions. These specialized computing devices are intelligent and respond to particular machines in a specific way. However, that same specialization can become a drawback for edge computing in certain industries that require immediate responses.

That was all about edge computing vs. cloud computing.

Where edge computing performance matters

Open opportunity

Some of the same container technologies that have become important for moving workloads between enterprise systems and the cloud will be employed for distributing computing to edge locations.

Real-time performance is one of the main reasons for using an edge computing architecture, but not the only one. Edge computing can also help prevent overloading network backbones by processing more data locally and sending to the cloud only the data that needs to go there. There can also be security, privacy, and data sovereignty advantages to keeping more data close to the source rather than shipping it to a centralized location.

The edge and the cloud coexist

In other words, functions best handled by computing split between the end device and local network resources will be done at the edge, while big-data applications that benefit from aggregating data from everywhere, and running it through analytics and machine learning algorithms that run economically in hyperscale data centers, will stay in the cloud. And the system architects who learn to use all these options to the best advantage of the overall system will be heroes.

Cloud, edge, and 5G

5G refers to the fifth generation of telecommunications networks, representing upgrades in bandwidth and latency. 5G is a transport mechanism that enhances the capabilities of cloud computing and edge computing, but 5G is not the edge, an edge device, or edge computing. Mobile computing is also not the same thing as edge computing. Said another way, a smartphone is (usually) not an edge device.

Edge computing and 5G — Aren’t they the same thing?

No! Although 5G and edge are sometimes used interchangeably, they really are different technologies and deliver different value for different consumer, retail, and industrial sectors.

Edge computing, at its core, is about moving workloads and models away from the cloud and closer to where the action is. Edge is about managing distributed AI models that can process data and distill insights by running predictive analytics close to the source of the data. Next-generation edge tools also facilitate orchestration at scale and include autonomous management.
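A minimal sketch of that idea, assuming a hypothetical pre-trained threshold model deployed on an edge node (the class, field names, and limit are invented for illustration, not any real framework API):

```python
from dataclasses import dataclass

@dataclass
class EdgeModel:
    """Hypothetical predictive-maintenance model deployed on an edge node."""
    threshold: float  # vibration limit in mm/s (illustrative value)

    def predict(self, vibration_mm_s: float) -> str:
        # The prediction runs locally; no round trip to the cloud.
        return "maintenance_due" if vibration_mm_s > self.threshold else "ok"

model = EdgeModel(threshold=7.1)
print(model.predict(3.2))   # decided at the source of the data
print(model.predict(9.8))
```

The insight (a maintenance flag) is distilled next to the machine; only that flag, not the raw vibration stream, needs to travel upstream.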

5G, on the other hand, is a communications protocol and technology set. For the most part, the value lies in new and improved methods of communication, which sometimes include faster, lower latency methods of communicating between devices.

But faster is better, right?

Yes, 5G promises lower latency, higher bandwidth, and network slicing. But while 5G might take latency down from 9 ms to 5 ms compared with 4G, that figure only covers the hop from the device to the cell tower. Currently, the end-to-end round-trip latency from the device to a cloud service is 400-500 ms, so 5G will really only be reducing total latency by about 1%. Likewise, the core infrastructure network is still bottlenecked at the cloud backend.
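The arithmetic behind that "about 1%" claim can be checked directly from the figures quoted above, taking 450 ms as the midpoint of the 400-500 ms range:

```python
# Figures quoted in the text above.
latency_4g_ms = 9      # device-to-tower latency on 4G
latency_5g_ms = 5      # device-to-tower latency on 5G
end_to_end_ms = 450    # midpoint of the 400-500 ms device-to-cloud round trip

saving_ms = latency_4g_ms - latency_5g_ms
saving_pct = 100 * saving_ms / end_to_end_ms
print(f"5G alone trims about {saving_pct:.1f}% off the round trip")
# prints "5G alone trims about 0.9% off the round trip"
```

Shaving 4 ms off a 450 ms round trip is under 1%, which is why the air interface alone cannot deliver the responsiveness people expect from 5G.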

What we really want is smarter, not faster. That requires moving the machine-learning prediction workload closer to the person or process to avoid the backend bottleneck. We need edge computing to fulfill that "smarter, not faster" goal. Edge computing works over 5G, 4G, LTE, wireline, satellite, or even disconnected links; it doesn't really care.

The basic idea: place computing resources closer to the user or the device, at the "edge" of the network, rather than in a hyperscale cloud data center that might be many miles away in the "core" of the network. The edge approach emphasizes reducing latency and providing more processing of data close to the source.

Mobile apps working with the edge network could make greater use of artificial intelligence and machine learning algorithms if they didn’t have to rely entirely on their own processors — or drain phone batteries with intense computation. Other frequently mentioned applications include edge computing for autonomous cars, augmented reality, industrial automation, predictive maintenance, and video monitoring.

How do 5G and Edge Computing Work Together?

IoT devices are most effective when they have high levels of connectivity on the network edge. When powered with sufficient connectivity, these devices can transmit large amounts of data in a flash. While IoT devices can store and process data locally, their ability to rapidly communicate information to other devices in the area is what makes them truly revolutionary.

5G and edge computing are a match made in heaven. While 5G technology operates similarly to existing cellular technology to transmit data over long distances, it's still somewhat lacking from a connectivity perspective on its own.

Take autonomous cars, for example. They'll need to be able to not only take in data from their own sensors, but also share that data with vehicles on the road around them. 5G will enable these vehicles to take in large volumes of data, but edge computing is what will empower them to act on that data as needed.

Edge computing architecture keeps data close, while 5G technology gets it where it needs to be as quickly as possible. In short, they’re the peanut butter and jelly of data management.

As 5G infrastructure becomes more commonplace, edge data centers and IoT devices will be able to form processing areas that allow data to be generated, collected, and analyzed locally with minimal latency. This means the network edge will no longer be an edge in the traditional sense, but rather a ring of interconnected 5G networks that makes it easier to manage data and prioritize what information needs to be transmitted back to centralized servers.

Real-life use cases of integrated edge computing and 5G networks

Conclusion

The one-two punch of 5G technology and edge computing framework will help companies fundamentally transform the way they design and deliver their network services. Rather than designing their infrastructure from the inside out, the focus will shift to the edge where customers are located. Smaller, more versatile edge data centers will become a critical factor for organizations looking to create more responsive and dynamic networks that can empower their IoT strategies.

As the processing capabilities of IoT devices continue to increase and customers demand more data-intensive services (such as augmented/virtual reality and high-fidelity digital media), companies will need to find ways to leverage the speed and bandwidth potential of 5G connectivity to provide them. By integrating that technology with existing edge computing principles, they can overcome the longstanding last-mile latency problem associated with existing network infrastructure.

Thank you!!

Monowar Hossain

HOD, Microwave Unit (Planning&Operation)

VEON, Bangladesh

Mobile:+8801962424691

E-mail:monowar.hossain@banglalink.net

Originally published at https://www.linkedin.com.
