IoT, cloud, fog, edge, … do I hear more?

Reading Time: 4 minutes


Posted by Xavi Masip, Scientific Director (UPC)


Recent advances in technology have paved the way for a rather large set of fancy names, sometimes used loosely by the scientific community and firmly introduced into the market by industry providers keen to give their products an aura of novelty. There is no doubt, however, that this technological evolution is easing the development of novel services, fuelled in particular by two main cutting-edge concepts: the thing and the cloud.

The large and unstoppable deployment of devices with connection capabilities at the edge (referred to as "things"), including those traditionally used by end users (laptops, smartphones, etc.) as well as novel devices gaining smart capabilities (cars, small wearables and sensors/actuators, city infrastructure, etc.), has empowered the development of novel "smart" scenarios. Indeed, the evolution of computer architectures and the progress in battery consumption and lifetime, together with broad, stable and ubiquitous connectivity, allow any device at the edge to become a "computer", so that any "thing" may be connected, setting out the concept coined as the Internet of Things (IoT). The deployment of IoT is having a major impact on the development of smart scenarios, bringing disruptive services to many sectors, such as smart transportation, smart homes, e-health, smart manufacturing and, probably the best known, smart cities. From a small sensor monitoring power consumption in a manufacturing chain to detect failures, a traffic light monitoring traffic congestion or pedestrian misbehaviour, or a smart car (with computing capacity equivalent to more than a hundred MacBook Pros, all connected on a single platform, talking to its neighbours and potentially usable as a distributed platform for service computation), to a small, lightweight sensor detecting the heartbeat to monitor arrhythmia, the opportunities for novel services and applications are, to say the least, vast.


On the other hand, the cloud has become one of the most appealing ICT advances of recent years, mainly driven by the capacity to offload "tasks" to the cloud, i.e., away from the local device (host) requesting the task to be executed, thus reducing its own resource consumption (energy, bandwidth, storage, etc.), while also offering a very large set of capabilities (in terms of IaaS, PaaS and SaaS). In short, decoupling service execution to the cloud substantially benefits users and service providers, although it also imposes some limitations on the services to be executed. For example, forwarding a decision-making process (or any kind of smart processing strategy) to the cloud requires careful analysis of two main constraints that will undoubtedly impact the overall service performance. The first relates to delay, easily observed when considering that the cloud is located far from the host requesting the service. The second refers to data, and in particular to the fact that the data required to run the decision-making process is typically collected at the edge through sensors and then forwarded to the cloud, thus consuming a significant amount of network resources. Aspects related to security and energy consumption may be considered as well. In summary, the cloud brings many opportunities, although some of its inherent limitations may hinder the deployment of certain services.

A recently coined solution to address the limitations of the cloud boils down to moving such processing back to the edge. In doing so, the network is not unnecessarily overloaded and the delay is greatly reduced. This shift is referred to as fog computing (also known as edge computing). The main idea is to bring the smart processing closer to the host requesting the service, which is also where the data is collected.
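To make the trade-off concrete, here is a back-of-the-envelope sketch in Python. All numbers (bandwidth, round-trip time, processing time, data volume) are hypothetical assumptions for illustration, not measurements from the project; the point is simply that once the input data has to travel over a wide-area link, a nearby but less powerful fog node can beat a distant cloud.

    # Back-of-the-envelope comparison of offloading a decision-making task
    # to the cloud versus a nearby fog node. All figures are illustrative
    # assumptions, not measurements from the mF2C project.

    def offload_time(data_mb, bandwidth_mbps, rtt_ms, compute_s):
        """Estimate end-to-end time: upload the input data, pay the
        round-trip latency, then wait for the remote computation."""
        transfer_s = (data_mb * 8) / bandwidth_mbps   # MB -> Mbit -> seconds
        return transfer_s + rtt_ms / 1000.0 + compute_s

    # Hypothetical scenario: 50 MB of sensor data per decision cycle.
    cloud = offload_time(data_mb=50, bandwidth_mbps=20,  rtt_ms=80, compute_s=0.5)
    fog   = offload_time(data_mb=50, bandwidth_mbps=200, rtt_ms=5,  compute_s=2.0)

    print(f"cloud offload: ~{cloud:.1f} s")  # ~20.6 s, dominated by the WAN transfer
    print(f"fog offload:   ~{fog:.1f} s")    # ~4.0 s, despite a slower processor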

In summary, the overall scenario consists of a host requesting a service and a large set of resources, spanning from the very edge (close to the host) up to the cloud, to which the service may be allocated in search of optimal performance.

Put that way, some kind of coordination becomes mandatory to properly and efficiently manage the whole stack of resources (referred to as fog-to-cloud, F2C). Aligned with this need, the main objective of the mF2C project is to design and develop an open, service- and technology-agnostic, secure solution to manage the whole set of resources from fog to cloud, aiming at optimized service execution. To that end, the mF2C project envisions a hierarchical management architecture that groups the different resources from fog to cloud into layers according to a given policy, and allocates services so as to optimally match service requirements with the resources available across the whole F2C stack. A similar approach has also recently been proposed by the OpenFog Consortium in its Reference Architecture.
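As a rough illustration of that matching idea (a minimal sketch with invented resource descriptions and a simple lowest-layer-first policy, not the actual mF2C management algorithm), the snippet below allocates a service to the closest layer whose free resources and latency satisfy its requirements, climbing towards the cloud otherwise.

    # Illustrative sketch of hierarchical F2C allocation: try the closest
    # layer first and move towards the cloud only when the lower layers
    # cannot satisfy the request. Resource figures are made up.
    from __future__ import annotations
    from dataclasses import dataclass

    @dataclass
    class Resource:
        name: str
        layer: int         # 0 = edge device, 1 = fog aggregation, 2 = cloud
        free_cpu: float    # available cores (hypothetical units)
        free_mem_gb: float
        latency_ms: float  # round-trip time seen by the requesting host

    @dataclass
    class Service:
        cpu: float
        mem_gb: float
        max_latency_ms: float

    def allocate(service: Service, resources: list[Resource]) -> Resource | None:
        """Return a matching resource in the lowest possible layer."""
        for r in sorted(resources, key=lambda r: (r.layer, r.latency_ms)):
            if (r.free_cpu >= service.cpu and r.free_mem_gb >= service.mem_gb
                    and r.latency_ms <= service.max_latency_ms):
                return r
        return None  # no layer can host the service as specified

    stack = [
        Resource("smart-car",   layer=0, free_cpu=1,  free_mem_gb=0.5, latency_ms=1),
        Resource("fog-gateway", layer=1, free_cpu=4,  free_mem_gb=8,   latency_ms=5),
        Resource("cloud-vm",    layer=2, free_cpu=32, free_mem_gb=64,  latency_ms=80),
    ]

    # A latency-sensitive task that is too big for the car itself:
    print(allocate(Service(cpu=2, mem_gb=4, max_latency_ms=20), stack).name)  # fog-gateway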

The benefits introduced by the mF2C model are mainly rooted in the smart coordination of fog and cloud resources, making the most of both computing paradigms. Thus, the limitations of the cloud (delay, traffic overload) may be compensated by the fog, and the limitations of the fog (limited capacity, volatility) may be compensated by the cloud. In addition, the mF2C project also considers distributed service execution across resources located at different layers, running either sequentially or in parallel, again intended to benefit as much as possible from the whole stack of resources and from smart, coordinated management.
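As a toy illustration of that distributed pattern, the following sketch uses ordinary Python threads as stand-ins for fog and cloud resources (purely an assumption for the example, not the mF2C runtime): a service is split into sub-tasks that run in parallel on different "layers", and their partial results are combined at the end.

    # Conceptual sketch of parallel sub-task execution across layers.
    from concurrent.futures import ThreadPoolExecutor

    def run_on_fog(chunk):
        # e.g. latency-sensitive pre-filtering close to the sensors
        return [x for x in chunk if x > 0]

    def run_on_cloud(chunk):
        # e.g. heavier aggregation on ample cloud resources
        return sum(chunk)

    readings = [3, -1, 4, -2, 7, 0, 5]

    with ThreadPoolExecutor() as pool:             # one worker per "layer"
        fog_future = pool.submit(run_on_fog, readings[:4])
        cloud_future = pool.submit(run_on_cloud, readings[4:])
        filtered, total = fog_future.result(), cloud_future.result()

    print(filtered, total)  # [3, 4] 12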

But the value of the mF2C proposal is not only technical. From a business perspective, mF2C is expected to bring new opportunities by leveraging the collaborative model defined in the project, in which devices may share their resources according to certain policies, as well as by introducing the parallel service execution model envisioned in the project.

Finally, it is worth emphasizing that mF2C opens up a whole set of problems to be solved, from security and quality provisioning to mobility, dynamics and volatility, which makes the project and the technology behind it a very attractive research avenue to which both the scientific community and the industrial sector can, and are expected to, contribute.