Fog / Edge-computing and Cloudlets

Viral Sheth
7 min read · Feb 13, 2019


Cloud Computing is a paradigm in which computer system resources are rapidly provisioned to clients from a shared pool over the internet. With the low cost of high-end infrastructure, coupled with the assurance of easy scalability and flexibility, Cloud solutions gained popularity rapidly. However, with the evolution of Internet of Things (IoT) devices, the industry has realized that the Cloud architecture cannot handle the huge influx of information coming from a variety of sources possibly located in different corners of the world. While the core concept of the Cloud, which dictates that the shared resources be centrally located, offers a cost-effective solution to clients, it causes delays and performance bottlenecks when the data to be processed resides far away from the centralized cloud location. This has prompted the development of new small-scale clouds, called cloudlets, near the 'edge' of the overall cloud infrastructure, so that the distance between the data and the processing unit is minimal. This evolution of Cloud Computing is called Edge Computing or Fog Computing. Fog / Edge computing is a nascent field with tremendous potential.

Background

Cloud Computing had its golden decade from around 2005 to 2015. During this period, a number of advances in Cloud technology took place, for instance big data, MapReduce, NoSQL databases, and the Internet of Things (IoT). The concept of Cloud Computing was straightforward: a powerful resource pool with theoretically infinite computing capability was made available at a centralized location called the Cloud, accessible through the internet. It is simply not practical for individual companies to buy this processing power for their transient computing needs. Thanks to Cloud Computing technologies, these companies were able to 'rent' the otherwise unaffordable resources from Cloud providers and meet their business objectives. However, despite the benefits offered, Cloud Computing itself is seeing a paradigm shift. The prolific emergence of rich IoT devices has warranted further improvements to Cloud technologies in the form of the so-called Edge and Fog Computing. The idea of Edge / Fog computing is simple: the heavy lifting is shifted from the centralized clouds to somewhere closer to the IoT devices that produce and consume data. Bringing the processing close to the point of data generation and consumption allows for ultra-short response times, a mandatory requirement for many IoT devices.

Terminology

  • Internet of Things (IoT) — The term "Internet of Things (IoT)" was first introduced by Ashton et al. from Procter & Gamble in the context of supply chain management. The Internet of Things (IoT) is the network of physical devices, vehicles, home appliances, and other items embedded with electronics, software, sensors, actuators, and connectivity, which enables these things to connect, collect and exchange data. Previously, internet connectivity was never considered a requirement for devices other than computers, phones or tablets. In the present era, ordinary devices like microwave ovens, refrigerators and vehicles also require internet connectivity in order to perform novel and sophisticated functions. For example, the Amazon Echo is an IoT speaker device that connects to the internet to provide its users with a virtual assistant service.
  • Cloud Computing — Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. A cloud can be private, community, public or hybrid. Cloud providers offer three types of service models: (i) Software as a Service (SaaS), (ii) Platform as a Service (PaaS), and (iii) Infrastructure as a Service (IaaS).
  • Fog Computing — Bonomi et al. from Cisco coined this term to refer to extending Cloud Computing to the edge of an enterprise's network. Fog sits between the IoT devices and the cloud / data center. The fog layer consists of network-connected computing and storage devices like controllers, switches, routers and servers; these devices are called fog nodes. Diverse IoT devices produce terabytes of data which, if sent to the centralized cloud servers, can result in latency due to the sheer size of the data. Because fog nodes are in close proximity to the IoT devices, they avoid this cross-network traffic and address the latency issue of the traditional cloud architecture. The fog layer performs pre-processing of the data and then securely transports it to the cloud for more expensive computing and storage; a sketch of this pre-processing flow appears after this list.
  • Edge Computing — Edge Computing has the same objective as Fog Computing. Both are very similar in that the idea is to move data analytics tasks from the centralized cloud server to devices at the edge of the network of IoT devices, thereby reducing latency. However, in Edge Computing the computation happens directly on the devices to which the sensors are attached, while in Fog Computing it happens on devices connected to the IoT devices over a LAN. Dastjerdi and Buyya describe that in Edge Computing the edge nodes cannot handle the workload when multiple IoT applications compete with each other, and that Fog Computing overcomes this limitation by making use of local clouds called cloudlets.
  • Cloudlet — The term "Cloudlet" was first introduced by Satyanarayanan et al. A cloudlet is a small-scale data center or cloud located at the edge of the internet. Its objective is to bring cloud-computing capabilities closer to the consumer. Cloudlets are region-specific and typically used for mobile consumers or devices. When a mobile device moves from one region to another, it disconnects from one cloudlet and connects to another. Both clouds and cloudlets achieve user isolation through virtual machines. However, as the mobile device moves from one region to another, its current cloudlet has to hand off the user's virtual machine to the new cloudlet.
  • Mobile Edge Computing — Mobile Edge Computing (MEC) is Edge Computing applied to mobile devices and mobile networks: computing and storage resources are placed close to mobile users, typically at or near cellular base stations, so that latency-sensitive tasks can be offloaded from the device without traversing the core network.
  • Micro Data Center — A Micro Data Center (MDC) is a small, containerized data center. A typical full-fledged data center may have hundreds of racks of servers, whereas a micro data center has a few racks with four or fewer servers per rack. In the context of Cloud Computing, micro data centers are the small-scale data centers hosted in cloudlets. A micro data center is not a requirement for Fog Computing; rather, it is an enhancement to it. Its goal is to bring down the cost of the centralized data center by breaking it into multiple micro data centers distributed across different locations. This, however, works hand in hand with Fog Computing, which aims to reduce latency by shifting computing from the centralized cloud to the edge of the network.
  • Cognitive Computing — Cognitive Computing refers to a platform that leverages artificial intelligence, machine learning, natural language processing, speech recognition and computer vision to perform tasks like face detection, speech detection, behavioral recommendations, risk assessment and fraud detection.
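To make the fog-node role described above more concrete, here is a minimal sketch of the pre-processing flow mentioned in the Fog Computing entry: a node that buffers raw readings from nearby IoT devices and forwards only a compact summary to the cloud. The class name, the window size, and the forward callback are illustrative assumptions, not part of any particular fog platform.

```python
# A minimal sketch of a fog node that pre-processes IoT sensor readings
# locally and forwards only a compact summary to the cloud. Names and
# structure are illustrative assumptions, not a specific fog framework.

from statistics import mean
from typing import Callable, Dict, List


class FogNode:
    def __init__(self, window_size: int, forward: Callable[[Dict], None]):
        self.window_size = window_size   # readings aggregated per batch
        self.forward = forward           # e.g. a secure POST to the cloud
        self.buffer: List[float] = []

    def ingest(self, reading: float) -> None:
        """Called for every raw reading arriving from a nearby IoT device."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush_to_cloud()

    def flush_to_cloud(self) -> None:
        """Send a small summary instead of the raw stream, cutting traffic."""
        summary = {
            "count": len(self.buffer),
            "min": min(self.buffer),
            "mean": round(mean(self.buffer), 2),
            "max": max(self.buffer),
        }
        self.forward(summary)            # only the summary crosses the WAN
        self.buffer.clear()


if __name__ == "__main__":
    node = FogNode(window_size=5, forward=lambda s: print("to cloud:", s))
    for temp in [21.0, 21.2, 35.8, 21.1, 20.9]:   # simulated sensor stream
        node.ingest(temp)
```

In a real deployment the forward callback would be an authenticated call to the cloud provider's ingestion API; the point is simply that only the small summary, not the raw stream, crosses the wide-area network.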

Advantages of Fog / Edge Computing

  1. Reduced response time — In Fog / Edge Computing, the cloudlets or fog nodes are closer to the IoT devices than the centralized cloud is. In traditional cloud computing, data emerging from the producers take multiple network hops to reach the far-away centralized Cloud server. In Fog Computing, however, data typically take only one wireless hop to reach the fog nodes where they are processed. This results in low latency and fast response times.
  2. Scalability — In Fog / Edge Computing, the heavy lifting is done by individual cloudlets serving different regions, and only a small portion of the data is sent from a cloudlet to the centralized cloud. Since the combined volume of data reaching the centralized cloud from all IoT devices is lower than in the traditional Cloud Computing model, the centralized cloud in a Fog Computing setup can cater to a significantly larger number of IoT devices without needing additional bandwidth. Secondly, the Fog Computing setup simply requires additional cloudlets if the number of IoT devices or consumers increases; if the number of consumers drops, the least busy cloudlets can be decommissioned. Overall, this setup offers easy scalability for the Cloud Computing infrastructure.
  3. Privacy — A cloudlet-based Fog Computing solution offers end users fine-grained control over access to their sensitive data. In a traditional Cloud Computing model, users typically do not feel confident about the privacy of sensitive data stored and processed in the centralized cloud.
  4. Controlling Cloud Outages — Since the data are stored in cloudlets instead of a single cloud, a catastrophic failure of one cloudlet does not result in a denial of service to the end user. If a cloudlet becomes unavailable due to a network outage or a cloudlet failure, the user can be redirected to the next nearby cloudlet and still be serviced, as in the selection sketch after this list.
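The latency and outage points above both reduce to cloudlet selection: prefer the nearest cloudlet and fall back to the next one when it is unreachable. The sketch below illustrates that policy; the cloudlet names, latencies, and health flags are hypothetical.

```python
# A minimal sketch of how a client (or a local broker) might pick a cloudlet:
# prefer the lowest-latency one and fall back to the next nearest if the
# preferred cloudlet is unreachable. All values here are illustrative.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Cloudlet:
    name: str
    rtt_ms: float         # measured round-trip time from the client
    healthy: bool = True  # result of a recent health check / heartbeat


def pick_cloudlet(cloudlets: List[Cloudlet]) -> Optional[Cloudlet]:
    """Return the nearest healthy cloudlet, or None if all are down."""
    for c in sorted(cloudlets, key=lambda c: c.rtt_ms):
        if c.healthy:
            return c
    return None           # fall back to the centralized cloud in this case


if __name__ == "__main__":
    nearby = [
        Cloudlet("cloudlet-downtown", rtt_ms=4.0, healthy=False),  # outage
        Cloudlet("cloudlet-campus", rtt_ms=7.5),
        Cloudlet("cloudlet-airport", rtt_ms=15.0),
    ]
    chosen = pick_cloudlet(nearby)
    print("serving from:", chosen.name if chosen else "central cloud")
```

A production system would measure round-trip times and run periodic health checks instead of hard-coding them, but the policy stays the same: nearest healthy cloudlet first, centralized cloud as the last resort.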

References

  1. Arwa Alrawais, Abdulrahman Alhothaily, Chunqiang Hu, and Xiuzhen Cheng. Fog computing for the internet of things: Security and privacy issues. IEEE Internet Computing, 21(2):34–42, 2017.
  2. Kevin Ashton et al. That 'internet of things' thing. RFID Journal, 22(7):97–114, 2009.
  3. Ketan Bhardwaj, Ming-Wei Shih, Ada Gavrilovska, Taesoo Kim, and Chengyu Song. SPX: Preserving end-to-end security for edge computing. arXiv preprint arXiv:1809.09038, 2018.
  4. Mahadev Satyanarayanan, Victor Bahl, Ramón Cáceres, and Nigel Davies. The case for VM-based cloudlets in mobile computing. IEEE Pervasive Computing, 8(4):14–23, 2009.
  5. Flavio Bonomi, Rodolfo Milito, Jiang Zhu, and Sateesh Addepalli. Fog computing and its role in the internet of things. In Proceedings of the first edition of the MCC workshop on Mobile cloud computing, pages 13–16. ACM, 2012.
