Edge Computing: what it is and why it could be our future

The millions of devices in the Internet of Things that surround us have a problem: they collect information, but they don’t do anything with it. They send it to the cloud, where large data centres process it to draw certain conclusions or trigger certain events.

This “passive” mode of operation is what Edge Computing aims to change: a philosophy, applicable especially in business and industrial scenarios, that gives all these devices much more autonomy, making them a little “smarter”.

Analysis and results on-premises, not in the cloud

Until now, in most cases, the big Cloud Computing platforms have been doing the “dirty work” of analysing the data collected by sensors and IoT devices.

This paradigm is not optimally efficient in scenarios where the network nodes themselves can analyse that data without going through the cloud.

As NetworkWorld notes, so-called edge computing “allows data produced by IoT devices to be processed closer to where it was created rather than being sent over long distances to reach data centres and compute clouds”.

This has a fundamental advantage, as it “allows organisations to analyse important data in near real-time, something that is a clear need in many industries such as manufacturing, healthcare, telecommunications or the financial industry”.

It is precisely this definition that explains this new trend whereby devices and sensors located everywhere are not only collecting this data and sending it to the cloud, but also processing it directly. Industrial applications in this area are diverse, and such an approach can indeed bring about a noticeable improvement in many processes.

Too much data, and perhaps too many IoT devices

Firms such as McKinsey & Co. estimate that the Industrial Internet of Things (IIoT) will produce revenues of $7.5 trillion by 2025.

One of the problems facing this prediction is how to manage the huge amount of data that all these devices will generate. How many will exist by then?

The numbers are difficult to measure at the moment. “It’s human nature,” explained a Gartner analyst. “If you’re in this industry, and you’re launching new products, there’s a lot of excitement about it and a lot of expectations.”

The truth is that many of these connected devices will have not only data collection capabilities, but data processing ones as well. And if they don’t, nodes in those enterprise or industrial networks will. This is where the Edge Computing philosophy takes shape, and where, among other things, there will be efficiency gains: not having to transmit all that data to the cloud already means significant savings.

Fog computing

There is another term closely related to Edge Computing that is being used more and more in this area: Fog Computing. As a Cisco study revealed, this platform allows “extending the cloud to be closer to the things that produce and are driven by data from IoT devices”. The study added that “any device with network connectivity, compute and storage capacity can be a node in this ‘fog’”.

This philosophy could be said to allow large cloud data centres to “delegate” part of their responsibilities to Edge Computing devices, and to do so through Fog Computing, which defines the requirements at that end of an ecosystem that, as we have said, has clear industrial applications.

Edge Computing refers specifically to how computational processes are carried out in “edge devices”: IoT devices with analysis and processing capabilities, such as routers or network gateways.

In contrast to this concept, Fog Computing refers to the network connections between edge devices and the cloud. The OpenFog Consortium, comprising Cisco, Intel, Microsoft, Dell EMC and academic institutions, has long been working on specifics for such deployments, where Edge Computing, Fog Computing and Cloud Computing systems interact.

Present and future benefits

There are also factors that will make this kind of paradigm even easier to adopt in the future: the decreasing cost of devices and sensors, coupled with the increasing power of even modest hardware.

There are also industrial needs that drive the commitment to edge computing: in certain environments, the only way to further optimise processes is to avoid communicating with the cloud as much as possible. This reduces latency, consumes less bandwidth – it is not necessary to send all data to the cloud at all times – and provides immediate access to analysis and assessment of the status of all these sensors and devices.
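The latency and bandwidth argument can be sketched in code. The following is a minimal, hypothetical Python gateway (the class name, window size and threshold are illustrative, not taken from any real product): it reacts to critical readings locally, with no cloud round-trip, and forwards only one compact summary per window instead of every raw sample.

```python
from statistics import mean

class EdgeAggregator:
    """Buffers raw sensor readings on the gateway and forwards only
    a compact per-window summary upstream, instead of every sample."""

    def __init__(self, window_size=60, alarm_threshold=90.0):
        self.window_size = window_size          # samples per summary
        self.alarm_threshold = alarm_threshold  # handled locally, no round-trip
        self.buffer = []

    def ingest(self, reading):
        """Process one reading at the edge; return a summary only when
        the window closes (the sole payload that would go to the cloud)."""
        # React to critical values immediately, without cloud latency.
        if reading >= self.alarm_threshold:
            self.trigger_local_alarm(reading)
        self.buffer.append(reading)
        if len(self.buffer) < self.window_size:
            return None  # nothing leaves the device yet
        summary = {
            "count": len(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
            "mean": round(mean(self.buffer), 2),
        }
        self.buffer = []
        return summary  # e.g. POST this single document upstream

    def trigger_local_alarm(self, reading):
        # Placeholder for an on-premises action (relay, PLC signal, ...).
        print(f"local alarm: {reading}")
```

Under these assumptions, 60 raw samples collapse into one small summary, so upstream traffic shrinks by roughly the window size while critical events are still handled on-premises.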

There is another interesting advantage: security. The less data there is in a cloud environment, the less is exposed if that environment is compromised. If security in these Edge Computing “micro data centres” is properly taken care of, this approach could gain a lot of ground.

This does not mean that dependence on the cloud and cloud computing environments will disappear: the two trends complement each other. Edge computing, for example, is more appropriate when speed and low latency in data transfers matter above all, while the cloud will continue to play a key role in analysing and processing the large amounts of data that require significant computing power.


Autonomous cars, the perfect example

If there is one area where this kind of philosophy makes sense, it is in autonomous cars. These “data centres on wheels” are constantly gathering information about their systems and their environment, and all this information needs to be processed in real time for optimal and safe autonomous driving.

Intel estimates that an autonomous car could end up generating 4 TB of data per day: the car’s cameras alone will be responsible for transferring between 20 and 40 Mbits per second to the system, plus another 10 to 100 Kbits per second from radar. This is only part of the data that the car will have to manage in order to function properly.
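Intel’s figures lend themselves to a quick back-of-the-envelope check. The snippet below is an illustrative calculation only, assuming continuous 24-hour operation and decimal units (1 GB = 10^9 bytes); a real car obviously does not stream flat-out all day.

```python
def daily_volume_gb(bitrate_mbit_s, hours_per_day=24):
    """Convert a continuous bitrate in Mbit/s into GB generated per day
    (decimal units: 1 GB = 1,000,000,000 bytes)."""
    bytes_per_second = bitrate_mbit_s * 1_000_000 / 8
    return bytes_per_second * hours_per_day * 3600 / 1_000_000_000

# Cameras at the upper bound cited above (40 Mbit/s), running all day:
cameras = daily_volume_gb(40)   # 432.0 GB/day
# Radar at its upper bound (100 Kbit/s = 0.1 Mbit/s):
radar = daily_volume_gb(0.1)    # ≈ 1.08 GB/day
```

Even under these simplified assumptions, the cameras alone would approach half a terabyte per day, which makes it clear why shipping every byte to the cloud is not an option.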

The autonomous car cannot afford to send data to the cloud and wait for a response: all this processing and analysis must be done in real time, and this is where Edge Computing comes into play, confirming the important role the car’s central computer plays in bringing together, analysing and responding to the needs of autonomous driving at all times.

These real-time needs sit alongside other equally interesting ones which, as we said, mean the cloud cannot be ruled out: everything “learned” by an autonomous car can be transferred to the cloud so that the rest of the models can benefit from that experience and from the most appropriate reaction to different events.

The ultimate idea behind this increasingly popular trend is essentially quite logical: leverage the most appropriate resources in each case – in cloud data centres or in systems full of sensors and IoT components – to optimise processing power and data transfer. It remains to be seen whether the paradigm will catch on.

