Cloud and Edge are part of the same Continuum

Reading Time: 4 minutes


Posted by Marc-Elian Bégin, CEO (SixSq)

Building robust and secure solutions for Smart City and Industry 4.0 requires IoT, Edge AND Cloud

There have been wild predictions recently that the edge will overtake, if not kill, the cloud in terms of the volume of data processed. While it is helpful to raise awareness of the potential of edge computing, which is undoubtedly huge, it is also an oversimplification. Instead of adding fuel to this dispute, I prefer to offer peace and bring those two fundamental building blocks together.

While working with Eduardo Quiñones, Senior Researcher at Barcelona Supercomputing Center on a successful H2020 project proposal (expect some really cool announcements about that shortly), he suggested a good term to illustrate a vision I share of the relationship between cloud and edge computing: continuum.

Handling the IoT Data Explosion

Many modern connected systems now include standard elements such as cloud services and IoT sensors. However, as IoT sensors produce a rapidly increasing amount of data, architectures have to evolve to cope with scale, and this poses significant challenges. The naive architecture where IoT sensors broadcast data directly to the cloud, via, say, LoRa, NarrowBand-IoT and the like, or 3/4G (and 5G eventually), might survive in really narrow and simple use cases, but in most real-life deployments this model is doomed to fail. To be fair to my LoRa and NB-IoT friends, there are cases where it's OK for a sensor to send a short message once a day or every few hours, but these use cases are far from the norm. Smart cities and Industry 4.0 will require a much more powerful approach, involving close collaboration between edge and cloud processing. This collaboration will evolve over time and, if done right, will offer operators unprecedented flexibility, bringing edge and cloud together as part of the same, well… continuum.

Depending on the specifics of each application, raw data will need a different level of processing, and might need to be correlated with other data, or even provide inputs to real-time decision making. The common denominator in all these scenarios is that data must be transformed into exploitable information. And in many cases, time is of the essence. This is why near-data processing matters, because it allows you to:

  • extract value from raw data as soon as possible
  • avoid flooding networks with useless or bloated data
  • ensure the right decision is taken at the right time and in the right place

Near-data processing therefore requires a close collaboration between the cloud and the edge.
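As a rough sketch of what near-data processing can look like in practice, the following aggregates a window of raw sensor readings on the edge device and forwards only a compact summary, flagging anomalies locally so time-critical decisions don't wait on a round trip to the cloud. The function name, data shape and threshold are invented for illustration, not taken from any SixSq product.

```python
import statistics

def summarise_window(samples, threshold=75.0):
    """Aggregate a window of raw readings at the edge, so only a
    compact summary (not every sample) travels to the cloud."""
    peak = max(samples)
    return {
        "count": len(samples),
        "mean": statistics.mean(samples),
        "max": peak,
        # Decide locally: the alert fires at the edge, in real time,
        # without waiting for the cloud.
        "alert": peak > threshold,
    }

# Five raw samples in, one small summary out.
window = [70.1, 71.3, 69.8, 82.4, 70.0]
print(summarise_window(window))
```

A five-element window is trivially small, of course; the same shape applies when a camera or vibration sensor produces thousands of samples per second and the network can only reasonably carry the summary.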

Security in our Inter-Connected World

All systems should be designed with security in mind from the outset. This can take several forms and is not equal for all applications or industries. If we look at smart cities and Industry 4.0, for example, many critical infrastructures are involved (e.g. lighting, transport, assembly lines, harbours, airports, surveillance), meaning security should be intrinsic, not added as an afterthought.

In addition, as sensors are becoming more prevalent, we must ensure personal privacy remains protected. This means ensuring, for example, that data generated by video cameras or microphones used in sensors does not leave the edge without proper processing (e.g. anonymisation, encryption). Therefore, good governance must be in place to ensure the right people have access to the right data, and that only specific data, no more no less, is allowed to move around the network.
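One minimal way to keep raw identifiers from leaving the edge is to replace them with a keyed hash before transmission, so the cloud side can still correlate records without ever seeing the original identity. This is only a sketch of the idea; the key name and record layout are hypothetical, and a real deployment would pair this with encryption in transit and a proper key-management scheme.

```python
import hashlib
import hmac

# Hypothetical per-site secret: kept on the edge device, never uploaded.
SITE_KEY = b"edge-site-secret"

def anonymise(record):
    """Replace the raw identifier with a keyed hash before the record
    leaves the edge, so the cloud never sees the original identity."""
    token = hmac.new(SITE_KEY, record["device_id"].encode(),
                     hashlib.sha256).hexdigest()
    # Same input, same key -> same token, so records remain correlatable
    # in the cloud without exposing the identifier itself.
    return {"device": token[:16], "reading": record["reading"]}

print(anonymise({"device_id": "camera-42", "reading": 3}))
```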

Future-Proof the System

Most, if not all, modern systems are connected and inter-connected in some way, which means they are likely to evolve. Therefore, manageability is key, such that deployment, operations, maintenance and evolution are cost-controlled. This means we must be able to manage software and reconfigure hardware remotely and securely, so that operators and planners are provided with a future-proof system.

For example,

  • over the air updates must be possible, in a robust and simple way, even for edge devices that are connected via poor network connections.
  • data retention policies must be flexible, such that aggregation logic can change without risking losing data.
  • peripherals around the edge devices change over time, which means it must be simple to integrate new sensors and actuators in the future.
  • as more edge capacity is deployed, users will want to take advantage of this and re-balance or distribute workloads across several edge devices in a simple and systematic way.

The underpinning principle here is simply that remote control capabilities and manageability allow users to understand, re-configure and update their cloud-edge continuum without needing expensive re-deployment of hardware, or manual and error-prone software updates. And since the process is largely automated, security is maintained throughout.

Bringing Blocks Together: Edge + Cloud + IoT = ∞

From this realisation, let's look at a potential architecture to address these challenges. The edge, introduced above, is where limited resources are available at the end of the network, close to the sensors. The edge is then connected over an IP network (e.g. Ethernet, WiFi, mobile, satellite) to the cloud, which can take the form of public and/or private cloud(s). Let's also assume that we treat the edge devices as constrained clouds, i.e. hyperconverged systems, and that applications (e.g. analytics, processing, acquisition, aggregation functions) are packaged as virtual machines or containers. The result is the foundation of a cloud-edge continuum. With this in place, and the right smart application management software, we have a continuum for executing applications, where operators can choose what processing to deploy in the cloud, at the edge, or both.
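To make the operator's choice concrete, here is a deliberately naive placement rule: latency-critical or data-heavy applications stay at the edge, everything else can run in the cloud. The thresholds and application descriptions are invented for illustration; real application management software would weigh many more factors (capacity, cost, data governance, connectivity).

```python
def placement(app):
    """Naive placement rule for a cloud-edge continuum:
    keep latency-critical or data-heavy workloads at the edge."""
    if app["max_latency_ms"] < 100 or app["raw_mb_per_min"] > 50:
        return "edge"
    return "cloud"

# Two hypothetical applications with very different constraints.
apps = [
    {"name": "traffic-light-control", "max_latency_ms": 20, "raw_mb_per_min": 1},
    {"name": "monthly-report", "max_latency_ms": 60000, "raw_mb_per_min": 0.1},
]
for app in apps:
    print(app["name"], "->", placement(app))
```

Because applications are packaged as virtual machines or containers, a rule like this can be re-evaluated and workloads moved without re-engineering the application itself, which is precisely what makes the continuum flexible.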

With flexible edge devices, ranging from high-density compute and storage to very lightweight embedded computers running on batteries and solar power, we are able to create a mist of resources on which we can deploy any application. In other words, we have created a continuum ranging from data centre environments all the way to small devices installed in lampposts, or in the middle of an agricultural field or the ocean.

Real World Applications

But is it achievable? Certainly. This principle of an edge-cloud continuum is driving our work at SixSq. Our NuvlaCity and NuvlaScience solutions are good examples of this vision of a seamless experience. They combine cloud and edge to deliver secure, automated, scalable systems. By embedding this concept in our industrial delivery process, our customers can put this vision in place quickly, securely and at an affordable price. And we can now deliver this experience world-wide.

This is what we are currently deploying in the mF2C H2020 funded project.


Acknowledgement: first published in