Cloud computing is currently one of the most popular technologies. It’s been adopted by both large corporations and individual users, enabling them to store data centrally on remote servers instead of on local computers. However, as the number of IoT devices grows and we leap into a hyper-connected world, data needs to be sent and received faster than ever. The current approach of directly connecting smart devices to the cloud has a number of disadvantages and cannot cope with the diverse needs of IoT applications. Most require a real-time response, which cannot be achieved through cloud computing mainly because of inherent latency. Therefore, experts suggest it’s time for next-gen technology. This is where edge computing comes in.
Top concerns about today’s approach to IoT
The cloud seems to be a natural architecture for IoT devices. But upon closer inspection, scientists from the University of California, Berkeley discovered many issues related to its application to IoT. In particular, they found gaps in security, latency, scalability, bandwidth, availability, and durability that clouds are powerless to fill.
Security

Information stored in a cloud is attractive to cybercriminals, making it one of today’s most pressing security challenges. The issue that worries experts most is that the extremely sensitive information collected by IoT devices and stored in the cloud can be accessed by unauthorized parties, significantly violating users’ privacy.
Latency

The cloud is often viewed as an extra component that links smart devices. From a structural point of view, though, it sits far from the edge of the network where those devices live. Because it takes time for information to get from one place to another, even the simplest IoT applications are prone to unpredictable latencies and erratic behavior. Each stage – sensing, transmission, internet delivery, and cloud processing – adds delay.
Scalability

Cisco expects that 50 billion devices will be connected to the cloud by 2020. In simple terms, as the number of connected devices grows, chances are the cloud will get bloated and the sheer amount of generated data will constrain processing.
Bandwidth

The bisection bandwidth requirements for a centralized cloud solution are staggering, especially since most data acquired by IoT devices can or should be processed locally and immediately discarded.
Shipping data to the cloud results in a significant amount of upstream traffic. The problem with current networks, however, is that they typically have more downstream than upstream bandwidth. IoT devices, by contrast, generate data at the edges of the network – a pattern that can easily saturate the upstream link, especially at scale.
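To get a feel for the scale, here is a back-of-the-envelope estimate in Python. All of the numbers (device count, sample rate, payload size, aggregation interval) are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope estimate of upstream traffic from raw sensor uploads.
# All figures are illustrative assumptions, not measured values.

devices = 10_000          # sensors behind one shared uplink
sample_rate_hz = 10       # readings per second per device
payload_bytes = 200       # bytes per reading (payload plus protocol headers)

# Every device ships every raw reading straight to the cloud:
raw_bps = devices * sample_rate_hz * payload_bytes * 8
print(f"Raw upstream: {raw_bps / 1e6:.0f} Mbit/s")        # → Raw upstream: 160 Mbit/s

# If an edge node instead condenses each device's readings into
# one summary per minute, upstream traffic shrinks dramatically:
summaries_per_sec = devices / 60
agg_bps = summaries_per_sec * payload_bytes * 8
print(f"Aggregated upstream: {agg_bps / 1e6:.2f} Mbit/s")  # → Aggregated upstream: 0.27 Mbit/s
```

Even with these modest assumptions, raw uploads demand hundreds of megabits of sustained upstream capacity, while per-minute edge aggregation needs a fraction of one megabit.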
Quality of service: Interruptions & more
Variable latency and occasional loss of internet connection are not things IoT applications can tolerate, in contrast to average web users. Since IoT devices have a direct impact on the physical world – consider traffic jams or heating, for example – unavailability of sensors can cause damage in real time. On top of that, cloud systems constantly suffer from interruptions due to operator errors, software bugs, DDoS attacks, and many other factors that depend on wide-area routing.
Getting an edge
The above points call on IoT providers to change the status quo in favor of a more reliable solution. Here’s where edge computing comes in.
What is edge computing and why does it matter?
Edge computing is a networking philosophy that suggests bringing computation as close to the data source as possible. By staying at the edge, near the data source, edge computing aims to reduce both latency and bandwidth use. In other words, edge computing intends to run fewer processes in the cloud and shift them to local devices. When data is handled on a user’s computer, an IoT device, or an edge server, it travels a shorter distance to its final destination, and thus the amount of long-distance communication between client and server is decreased.
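As a minimal sketch of the idea (the function name, fields, and threshold below are hypothetical, not from any specific platform), an edge node might summarize raw sensor readings locally and forward only the compact result – plus any anomalies – to the cloud:

```python
from statistics import mean

def process_at_edge(readings, anomaly_threshold=90.0):
    """Summarize raw sensor readings locally; only this small summary
    (and any anomalous values) would be shipped upstream to the cloud."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),               # how many raw samples were seen
        "mean": round(mean(readings), 2),     # local aggregate
        "max": max(readings),
        "anomalies": anomalies,               # only unusual values leave the edge
    }

# Raw temperature samples stay on the device; a single small
# dict is all that needs to cross the network.
samples = [71.2, 70.8, 95.3, 71.0, 70.9]
print(process_at_edge(samples))
```

The raw samples never leave the device: the cloud receives one summary object per interval instead of every individual reading, which is exactly the latency and bandwidth trade the section describes.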
What is a network edge, then? In a nutshell, a network edge is a place where a device (or the local network containing the device) maintains contact with the internet. Yes, you’ve understood it right: the term is a bit vague. Indeed, a user’s computer or even the processor inside an IoT camera can be regarded as a network edge, but a router, ISP, or local edge server can serve the same role. The key point is that the edge of the network should be geographically close to the device.
Technology benefits and drawbacks
Every day, the number of IoT devices grows. IoT devices are distributed across homes, offices, commercial buildings, and other spaces. According to Statista, there will be over 75 billion IoT devices installed worldwide by 2025. But cloud systems will be incapable of supporting such a large number of devices on their own. Some part of the computation will have to be moved to the edge to create a more reliable channel for data to flow.
Internet of Things (IoT) connected devices installed base worldwide from 2015 to 2025 (in billions)
Edge computing, in contrast to cloud platforms, doesn’t make you wait for network infrastructure to catch up to your needs. While the primary benefit of edge computing is minimizing bandwidth use and server resources, another benefit is reduced latency. A network edge prevents the delays that usually occur when a device needs to connect with a distant server. Additionally, edge computing can benefit businesses considerably, providing opportunities for real-time, on-the-spot data analysis.
To cut a long story short, edge computing is beneficial because it
- Decreases latency
- Reduces bandwidth use and associated costs
- Cuts down on required server resources
With edge computing, networks can scale to meet the demands of the IoT world without having to worry about overconsumption of network and server resources or wasting resources on processing irrelevant data.
Despite many benefits, there are also a number of cons to edge computing:
- Both cloud platforms and network edges are vulnerable to security breaches. Equipping IoT devices with advanced software and hardware and moving data analysis to the local level can make devices more susceptible to malicious attacks.
- The more edge computing devices you incorporate, the greater the risk of human error.
- The cost of edge computing is high, so it’s best to make sure you buy only the functionality you really need.
Edge computing use cases and value for businesses
There are many use cases where edge computing can bring value to manufacturing, security, healthcare, and other industries. Thomas Bittman, vice president and analyst at Gartner, reports that “the number of enterprises who are saying edge is part of their core strategy has doubled in a year.” He also thinks that by 2020, about half of enterprises will rely on edge technologies as part of their strategy. Here are the areas where edge computing already flourishes:
Smart homes and buildings
Smart homes, equipped with Amazon Alexa or Google Assistant, enable people to complete many routine tasks like turning on lights or changing the temperature. Right now, these actions take a few seconds to happen, but edge computing can make them happen in real time.
There’s also huge potential for network edges in autonomous driving. Bittman argues that to learn how to drive, self-driving cars cannot rely on the cloud to process data. Instead, they need to run all machine learning processes directly in the car, on board.
In the context of industrial automation, edge computing can come in handy to create machines that sense, detect, and learn things without being previously programmed to do so.
(Diagram: from today’s complex IIoT architecture to a flattened IIoT architecture enabled by edge computing)
Smart cities are another example of where edge computing can be useful, providing a resilient platform. A cloud connection that relies on Wi-Fi or cellular data cannot guarantee sufficient coverage – not even with 5G, which is years from mass adoption. Edge computing, in turn, can help manage streetlighting, security cameras, worker messaging, sensor monitoring, and other areas of smart cities, while opening opportunities for a more integrated rather than dispersed approach to IoT.
Will edge computing replace the cloud?
Most experts predict it won’t. Rather, the cloud will complement edge computing, acting as a hub for data storage and advanced data processing, since key factors such as security, stability, and processing power will continue to drive the need for big datacenters. However, when a responsive solution with reduced latency is required, it’s reasonable to implement edge computing to make it faster and more reliable.
Applying edge computing solutions to IoT development has its ups and downs, but it has a clear advantage over cloud platforms where data processing is concerned. Soon, it will become the new black for IoT innovators and speed up communication in the world of technology.