Edge Computing vs Fog Computing

Ridwan Shariffdeen
6 min read · Oct 6, 2017


Last year, when I was attending the IEEE CLOUD16 conference in San Francisco, there was a buzz about something called “the fog”. I attended the keynote speech by Kerrie Holley, then CTO of Analytics & Automation Platforms at Cisco Systems, where he spoke about new trends in software engineering, the emergence of DevOps practices, and the role of the cloud, big data, and machine learning in automation. One thing he mentioned that struck me was the emergence of something called the fog; in his own words, it was similar to the cloud but not exactly the cloud. The buzz about fog was all around the conference, especially in the IoT research work. At a time when the term ‘cloud’ was not yet familiar to everyone (many had heard of it, but few actually knew what it was), fog computing was a bit confusing, at least to me.

Recently I came across another buzzword, “edge computing”, which referred to the same, or at least a similar, idea (as far as I understood) as the fog. I also found a congress on edge computing named the MEC Congress, which stands for Multi-access Edge Computing (formerly Mobile Edge Computing) Congress. This was an opportunity for me to attend the event and learn about edge computing technology from the experts themselves. MEC Congress 2017, held in Berlin, Germany, was attended by many big tech companies such as Intel and Google, and by many other leading companies that use edge computing in their industries, such as Toyota, Huawei, AT&T, Vodafone, Verizon, and Sony. I was fortunate to get a VIP invitation for the whole event, covering the workshops, the conference, and the award ceremony as well. There will be a separate article focused on that.

So, what exactly is fog computing? And what is edge computing? Do they relate to each other, or are they the same thing with two different names? In order to explain what each of these means, let's first touch on what cloud computing is. Cloud computing was a revolution a few years back that changed how efficiently we process data with shared resources: it is the practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than a local server or a personal computer. With the rapid growth of technology and the emergence of the Internet of Things, there is far more data to be processed, and the demand for processing has skyrocketed. Cloud computing enabled us to share resources and speed up our computational work remarkably, but there was one bottleneck that everyone expected would become the single point of performance degradation for the cloud model: the network bandwidth.

Today technology has developed so much that you can share your experience with others in 4K resolution, or at least full HD. The internet has changed from an information-pulling source to a data-feeding mechanism, computational power is moving back from centralized to distributed architectures, and streaming video has evolved into augmented reality and virtual reality, enabling many advanced features for end users. But all of this comes at a price for network bandwidth. With so much data going around, telcos and middleware providers have to keep up with the demand and constantly push the limits further. We have now developed 5G technology to keep up with the high-speed demand and the never-ending push for better quality of service. But the questions remain: will the cloud alone be able to withstand the amount of traffic being generated? Do we really need to send everything to the cloud? Can't we have more computational processing closer to the device, at the edge of the network?

In an attempt to solve this while keeping the cloud model, two solutions were presented that involve pushing intelligence and processing capabilities down closer to where the data originates: fog computing and edge computing. For the manufacturing and automation industries, this means network and system architectures that attempt to collect, analyze, and process data from these assets (data-generating physical assets or things deployed at the very edge of the network, such as motors, light bulbs, generators, pumps, and relays) more efficiently than a traditional cloud architecture. These architectures share similar objectives (a small sketch after the list below illustrates the data-reduction idea):

  • To reduce the amount of data sent to the cloud
  • To decrease network and Internet latency
  • To improve system response time in remote mission-critical applications.
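
To make the first two objectives concrete, here is a minimal Python sketch of edge-side data reduction. The `read_sensor` and `send_to_cloud` functions are hypothetical stubs introduced only for illustration; a real deployment would wire these to an actual sensor driver and a cloud client (for example MQTT or HTTPS).

```python
import statistics
import time

# Hypothetical stubs: in a real deployment these would be a sensor driver
# and a cloud client; here they only illustrate the data-reduction idea.
def read_sensor():
    """Return one raw temperature reading (placeholder value)."""
    return 25.0

def send_to_cloud(payload):
    """Stand-in for uploading a payload to a cloud endpoint."""
    print("uploading:", payload)

WINDOW_SIZE = 60          # raw readings folded into one summary
ALERT_THRESHOLD = 80.0    # react locally and notify immediately above this

readings = []
for _ in range(180):                      # three summary windows, as a demo
    value = read_sensor()
    readings.append(value)

    # Mission-critical condition: decide locally, don't wait on a cloud round trip.
    if value > ALERT_THRESHOLD:
        send_to_cloud({"type": "alert", "value": value, "ts": time.time()})

    # Routine telemetry: ship one small summary instead of 60 raw samples,
    # which is exactly the "reduce the data sent to the cloud" objective.
    if len(readings) >= WINDOW_SIZE:
        send_to_cloud({
            "type": "summary",
            "mean": statistics.mean(readings),
            "min": min(readings),
            "max": max(readings),
            "ts": time.time(),
        })
        readings.clear()
```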

Fog Computing

Fog computing is a term Cisco coined for an extended cloud computing model. According to Wikipedia, the concept is

an architecture that uses one or more collaborative end-user clients or near-user edge devices to carry out a substantial amount of storage (rather than stored primarily in cloud data centers), communication (rather than routed over the internet backbone), control, configuration, measurement and management (rather than controlled primarily by network gateways such as those in the LTE core network).

In simple terms, it's an architectural concept that brings computational power, storage capability, and networking optimizations to the edge of the network, near the IoT devices, to address the challenges of latency, bandwidth, security, and so on. This would enable IoT to focus on its own growth rather than worrying about offloading and processing problems. The OpenFog Consortium was founded to create a framework that standardizes how fog computing should be built, with interoperability and an open architecture: a framework for creating a secure and robust multi-vendor interoperable fog computing environment.
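
As a rough illustration of that idea, here is a toy Python sketch of a fog node that keeps storage and simple queries local and forwards only a compact snapshot upstream. The class and its methods are made up for this example and do not correspond to any OpenFog API.

```python
import collections
import time

class FogGateway:
    """Toy model of a fog node: it stores recent data locally, answers
    nearby clients without touching the internet backbone, and decides
    what little gets forwarded to the cloud."""

    def __init__(self, retention=1000):
        # Local storage at the edge instead of in a cloud data center.
        self._buffer = collections.deque(maxlen=retention)

    def ingest(self, device_id, value):
        """Accept a reading from a nearby IoT device."""
        self._buffer.append((time.time(), device_id, value))

    def query_latest(self, device_id):
        """Answer a local query over the LAN, with no cloud round trip."""
        for ts, dev, value in reversed(self._buffer):
            if dev == device_id:
                return value
        return None

    def sync_to_cloud(self):
        """Only a compact snapshot ever leaves the site."""
        return {
            "count": len(self._buffer),
            "latest_ts": self._buffer[-1][0] if self._buffer else None,
        }

gateway = FogGateway()
gateway.ingest("pump-1", 42.7)
print(gateway.query_latest("pump-1"))  # answered at the fog node
print(gateway.sync_to_cloud())         # small payload bound for the cloud
```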

You can find more information about the framework on this site: https://www.openfogconsortium.org/ra/

To sustain IoT momentum, the OpenFog Consortium is defining a new architecture — fog computing — that brings information processing closer to where the data is being produced or used. — Source CISCO

Edge Computing

Another group formed to drive edge interoperability is the EdgeX Foundry, an open-source consortium approach managed by The Linux Foundation and seeded with some 125,000 lines of code developed internally by Dell Technologies. According to Wikipedia, edge computing is

a method of optimizing cloud computing systems by performing data processing at the edge of the network, near the source of the data. This reduces the communications bandwidth needed between sensors and the central data centre by performing analytics and knowledge generation at or near the source of the data. This approach requires leveraging resources that may not be continuously connected to a network such as laptops, smartphones, tablets and sensors.

In edge computing, physical assets like pumps, motors, and generators are again physically wired into a control system, where a PAC (programmable automation controller) automates them by executing an onboard control system program. Intelligent PACs with edge computing capabilities collect, analyze, and process data from the physical assets they're connected to, at the same time as they're running the control system program. In edge computing, intelligence is literally pushed to the network edge, where our physical assets or things are first connected together and where IoT data originates.
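
As a loose sketch of that pattern (not actual PAC firmware), the following Python loop runs a simple control action for a wired asset while doing on-board analytics in the same cycle. Every function name and setpoint here is a hypothetical placeholder.

```python
# Edge-controller sketch: one cycle both executes the control program and
# analyzes the asset's data, forwarding only exceptions upstream.

def read_motor_rpm():
    return 1480.0                # stand-in for a wired I/O read

def set_motor_power(level):
    pass                         # stand-in for a wired I/O write

def publish(event):
    print("to cloud:", event)    # stand-in for an upstream message

TARGET_RPM = 1500.0
history = []

for _ in range(10):              # one iteration per control cycle
    rpm = read_motor_rpm()

    # Control program: a simple proportional correction executed locally.
    set_motor_power(min(1.0, max(0.0, 0.5 + 0.001 * (TARGET_RPM - rpm))))

    # On-board analytics in the same cycle: track recent drift and report
    # upstream only when the asset strays far from its setpoint.
    history.append(rpm)
    recent = history[-5:]
    drift = abs(sum(recent) / len(recent) - TARGET_RPM)
    if drift > 100.0:
        publish({"asset": "motor-1", "avg_rpm_drift": drift})
```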

More about the OpenEdge architecture reference: https://documentation.progress.com/output/ua/OpenEdge_latest/index.html#page/gsdev%2Fan-overview-of-openedge-high-level-application-a.html%23wwID0ES3DK

(Image source: www.ntt.co.jp)

Summary

While “fog computing” and “edge computing” are overly simplified concepts that rehash ideas we've had before, the real opportunity lies in configuring the “nodes” and optimizing their performance. The primary difference between your IoT device communicating with a node versus the cloud is that bi-directional communication with a node can take milliseconds, while conversing in the same manner with the cloud can take minutes. In my opinion, these are two different solutions to a single problem: optimizing cloud performance. While fog computing has been favored more by service providers and data-processing companies, edge computing is favored by the telcos and middleware companies that actually own the backbone and radio networks.
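
If you want to put rough numbers on that difference yourself, a quick way is to time a TCP connection to a nearby node versus a remote endpoint. The hostnames below are placeholders for an edge node on your local network and an arbitrary cloud host, and connect time is only a crude proxy for round-trip latency.

```python
import socket
import time

def round_trip_ms(host, port=443, timeout=2.0):
    """Time a TCP connection as a crude stand-in for round-trip latency."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return None               # host unreachable within the timeout
    return (time.perf_counter() - start) * 1000.0

# Placeholder hosts: an edge node on the local network versus a distant
# cloud endpoint; substitute your own addresses to compare.
for label, host in [("edge node", "192.168.1.10"), ("cloud", "example.com")]:
    print(label, round_trip_ms(host), "ms")
```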

Hosting analytics, performance processing, and heterogeneous applications closer to physical assets and control systems can help enable edge intelligence. This will help us move portions of cloud-based applications closer to the devices that use them. It isn't easy to figure out which software tasks to remove from the cloud, but the growth of bandwidth-consuming devices may force us to take a different approach.
