Edge computing seeks to place data processing capabilities as close as possible to the point where they are needed. For IoT, this approach can mean lower latency and therefore faster, more responsive performance.
Discover more about edge computing, its purpose, benefits and application in IoT projects.
What is edge computing?
A traditional centralised IT framework is made up of two main elements. On the periphery, there are multiple endpoints whose main job is to capture data and act on instructions. Data from these peripheral devices is transmitted to and from the second element in the framework: a central server. This server is where virtually all of the heavy lifting of the system takes place; i.e. where data is processed, consolidated, analysed and actioned.
Edge computing involves a distributed rather than a centralised approach. Instead of all computation taking place at a central server, as many processing tasks as possible are carried out at the periphery, as close as possible to the data source. With more computing taking place at the edge of the network, the need for long-distance communication between peripheral devices and the central server is reduced.
How does edge computing work?
In many edge frameworks, key computational tasks take place at IoT device level. Take an industrial sensor, for instance. Under a centralised framework, the sensor merely captures data and feeds it back to the server. With an edge approach, the sensor device can process that data on-board and automatically trigger particular actions if required, without having to interact with the server.
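As a simple illustration, the device-level logic might look something like the Python sketch below. The temperature threshold, the shut_down_motor action and the send_to_server call are hypothetical placeholders for this example, not any particular vendor's API.

# Minimal sketch of device-level edge processing (hypothetical names and threshold).
TEMPERATURE_LIMIT_C = 85.0  # assumed safety threshold for this example

def handle_reading(temperature_c, shut_down_motor, send_to_server):
    """Act on a sensor reading locally; only report a summary upstream."""
    if temperature_c > TEMPERATURE_LIMIT_C:
        # Time-critical action is taken on the device itself, with no round trip to the server.
        shut_down_motor()
        send_to_server({"event": "overheat", "value": temperature_c})
    # Routine readings need not be transmitted at all, saving bandwidth.

The point of the sketch is simply that the decision is made where the data is produced; only the exceptional event, not every raw reading, ever leaves the device.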
In addition to, or as an alternative to, device-level processing, the framework might also include the deployment of edge computers or servers. Instead of all peripheral devices transmitting data to a central location, that data is sent to computers that are physically much closer (e.g. on the factory floor). Specific, time-dependent computations can be carried out much closer to the point of need, saving both time and long-distance data transmission costs.
What is an example of edge computing?
One example is a video surveillance system. Increasingly, such systems incorporate relatively sophisticated video management software (VMS) with features such as motion detection and automated headcount monitoring.
In a centralised framework, all video footage is sent to a central server, where it is analysed by the VMS. With edge architecture, the immediate analysis can be done at the camera end. A smart camera, for instance, will only start relaying footage to a server if and when motion is detected, and can trigger digital tripwires itself. This approach massively reduces the amount of expensive bandwidth required to relay video footage to the remote server.
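As a rough sketch of that gating logic, the Python below shows one way motion detection might decide whether footage is relayed at all. The camera, relay_to_server and trigger_tripwire objects, and the detection threshold, are illustrative stand-ins rather than the API of any real VMS.

# Sketch of edge-side motion gating on a smart camera (illustrative only).
import time

MOTION_THRESHOLD = 0.02  # assumed fraction of changed pixels that counts as motion

def frame_difference(previous_frame, current_frame):
    """Return the fraction of pixels that changed between two greyscale frames."""
    changed = sum(1 for a, b in zip(previous_frame, current_frame) if abs(a - b) > 10)
    return changed / max(len(current_frame), 1)

def monitor(camera, relay_to_server, trigger_tripwire):
    previous = camera.capture()
    while True:
        current = camera.capture()
        if frame_difference(previous, current) > MOTION_THRESHOLD:
            trigger_tripwire()        # local, immediate response at the camera
            relay_to_server(current)  # footage only leaves the camera when it matters
        previous = current
        time.sleep(0.1)

In a real deployment the comparison would be done by the VMS's own analytics, but the principle is the same: the expensive step of sending video over the network happens only after the edge device has decided it is worthwhile.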
What is the purpose of edge computing?
The primary purpose is usually to reduce latency. In sensitive IoT applications, such as industrial processes or healthcare procedures, every second or millisecond counts. By moving processing and computation as close as possible to the edge, you can substantially reduce the gap between a critical situation arising and appropriate action being taken.
Can edge computing replace cloud computing?
Edge computing is best viewed as a complement to cloud computing, rather than a replacement for it.
For collating data from multiple IoT devices and monitoring organisation-wide efficiency, usage trends and performance, you will still need data to be processed and analysed centrally, most likely in the cloud. Meanwhile, time-sensitive, device-specific processing can take place at the network edge.
What are the benefits of edge computing?
Responsiveness
Large physical distances, network congestion and transmission outages can delay the time it takes for a system to respond to what may be critical situations. With edge computing, you don’t need to send data to a remote analytics centre for decisions to be made, so your IoT devices can respond more quickly.
Cost
Where data is processed at device level or relayed to edge servers, you can substantially reduce the volume of data that needs to be transmitted to distantly located centres. That means lower bandwidth requirements, less congestion across the network and lower data transmission costs. It also means additional edge compute resources, either in the device or in an edge computer, which carry costs of their own, so careful system architecture is important.
Reliability
In some environments (e.g. rural agricultural sites and offshore facilities), connectivity may be restricted or unreliable. If IoT devices can execute core tasks autonomously, you are much less likely to suffer business disruption as a result of connectivity issues.
Privacy
Transmitting personal data across national or regional borders for processing can lead to regulatory issues (GDPR, for instance). With an edge computing approach, you can process raw data at the periphery and then obscure and secure it before onward transmission to data centres in other jurisdictions.
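For example, an edge gateway might pseudonymise records before they leave the local network. The Python sketch below shows one possible approach; the field names, record structure and salted-hash scheme are purely illustrative assumptions, not a prescribed compliance technique.

# Sketch of pseudonymising a record at the edge before onward transmission (illustrative).
import hashlib

def pseudonymise(record, salt):
    """Replace direct identifiers with a salted hash; forward only the fields needed centrally."""
    token = hashlib.sha256((salt + record["patient_id"]).encode()).hexdigest()
    return {
        "subject_token": token,              # stable pseudonym, not reversible without the salt
        "heart_rate": record["heart_rate"],  # operational data the central system actually needs
        "timestamp": record["timestamp"],
    }

raw = {"patient_id": "NHS-1234567", "name": "Jane Doe", "heart_rate": 72, "timestamp": "2024-05-01T10:00:00Z"}
print(pseudonymise(raw, salt="local-device-secret"))

The name and raw identifier never leave the edge; only the pseudonymised, minimised record travels onward to data centres in other jurisdictions.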
Security
Edge computing can help ensure that a greater slice of raw data remains at the source. There is generally less data being transmitted, and any data that does need to be sent can be streamlined and more easily secured. It means a smaller attack surface and less chance of a data breach.
What are the disadvantages of edge computing?
Hardware enhancements
If more computing is going to be done at the network periphery, your edge devices need to be suitably equipped for the job. This may mean a significant outlay on device upgrades or replacements. Depending on your framework model, there is also the cost of additional edge servers to factor in.
Connectivity
Edge computing significantly reduces the need for long-distance data transmission but doesn’t eliminate it entirely. You still have to think about an appropriate channel for at least some minimum level of communication with the central server.
Security
Boosting the capabilities of your edge IoT devices may also lead to additional exploit opportunities for attackers. A robust approach to device management is essential, including patch distribution and suitable encryption for data in transit.
Does edge computing need 5G?
It’s probably more accurate to say that 5G enables some hybrid edge computing architectures to emerge, such as Distributed Multi-Access Edge Compute (Distributed-MEC, or DMEC).
With the fifth generation of cellular technology, it becomes possible to transfer huge amounts of data in real-time. This opens the door to a much wider range of IoT applications, including those that utilise artificial intelligence (AI), machine learning, multimedia streaming and even virtual reality.
Under the 5G standard, latency (i.e. the delay between sending and receiving information) is reduced from around 200 milliseconds on 4G to as little as 1 millisecond. In reality, in order to meet this benchmark, and to deliver the type of feature-rich, ultra-responsive services that businesses and their customers will expect, it’s almost certainly going to take more than cellular network upgrades. If huge volumes of data are still being transmitted to a central server, businesses risk lag. Shifting the lion’s share of processing closer to the edge will help free up bandwidth and will ensure you realise the full benefits of 5G.
With Distributed-MEC (or DMEC), edge computing can be performed within the mobile core network. DMEC leverages the ultra-low latency of the 5G network and also removes the latency and lag associated with internet connections. It can also reduce hardware costs at device level and remove the need for dedicated edge computing equipment on site.
Find out more
For an expert assessment of your connectivity needs and to discover the best fit M2M options for your business, speak to Wireless Logic today.
Learn more about our connectivity solutions for a wide range of use cases here.