Cloud-based data analysis has given a major boost to the development of the Internet of Things, but more businesses are now pushing data processing closer to the edge. In fact, by 2019, 50% of the data produced by connected systems is expected to be processed at the device level.
Given that IoT devices' computing power is typically constrained by form factor and battery life, can we really expect edge computing to replace the cloud in the coming years?
Why cloud computing is getting edged out
High latency limits IoT solutions’ effectiveness
The major drawback of cloud-based IoT systems is that whenever users trigger an action, the software tier needs time to capture the command, send it to the server, and wait for a response before presenting the information in graphical form. A request for device status data gathered over a given period is a typical example. Such behavior is common for connected equipment, smart home products, and even lightweight wearable devices. Latency is regarded as the major barrier to building IoT solutions capable of making autonomous decisions in real time.
IoT devices produce a lot of data — and we don’t need to push all data to the cloud
By 2021, connected devices will produce 847 zettabytes of raw data annually (up from just 218 ZB in 2016). A smart oil rig, for example, can use up to thirty thousand sensors to monitor various performance parameters, including rotating hours, pumping speed, and stroke count. Yet a manufacturing company needs to interpret less than 1% of the sensor data in real time to identify abnormalities and prevent accidents. The remaining 99% is the so-called status data, which can be applied to train predictive maintenance models, but does not require immediate action.
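The split between the 1% of actionable data and the 99% of status data can be made right on the device with a simple edge-side filter. The sketch below is a minimal illustration; the sensor names, acceptable ranges, and queue structure are all hypothetical, not taken from any particular rig.

```python
# Minimal sketch of edge-side filtering: out-of-range readings are routed
# for immediate, real-time handling; routine "status" readings are batched
# for later, non-urgent upload. Sensor names and limits are hypothetical.

# Acceptable operating ranges per sensor (assumed values)
LIMITS = {
    "pumping_speed_rpm": (20, 120),
    "stroke_count": (0, 15),
}

def route_reading(sensor, value, realtime_queue, batch_queue):
    """Send out-of-range readings to the real-time path, the rest to batch."""
    low, high = LIMITS[sensor]
    if value < low or value > high:
        realtime_queue.append((sensor, value))   # needs immediate action
    else:
        batch_queue.append((sensor, value))      # status data, upload later

realtime, batch = [], []
for sensor, value in [("pumping_speed_rpm", 95),
                      ("pumping_speed_rpm", 140),   # abnormal reading
                      ("stroke_count", 9)]:
    route_reading(sensor, value, realtime, batch)

print(realtime)  # [('pumping_speed_rpm', 140)]
print(batch)     # [('pumping_speed_rpm', 95), ('stroke_count', 9)]
```

Only the contents of the real-time queue would ever need a low-latency path; everything else can wait for a cheap, scheduled upload.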
Hackers can intercept the data moving back and forth between an IoT device and a cloud server
Storing and processing IoT data in the cloud is, in most cases, safer than keeping it on on-premise servers. However, 91.5% of data transactions performed by connected devices in corporate networks are unencrypted, which gives hackers an opportunity to compromise local routers and capture IoT traffic.
Bandwidth and energy costs are rising, but there are no alternatives to cellular connectivity just yet
Major carriers like AT&T and Verizon are rolling out low-power networks for M2M communication, which are cheaper than LTE and save energy by cutting the data rate to just 120 Kbit/s. This, however, won't guarantee major cost savings in the long run. For one thing, bandwidth demand keeps growing, which might prompt telecom companies to adjust their pricing plans. Additionally, narrow-bandwidth networks cannot support data-heavy IoT operations like firmware updates, voice processing, and unstructured video analysis.
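A quick back-of-the-envelope calculation shows why a 120 Kbit/s link struggles with an operation like a firmware update. The 50 MB image size below is an assumed figure for illustration, and the estimate ignores protocol overhead and retransmissions:

```python
# Rough transfer-time estimate for a firmware image over a low-power link.
# The 50 MB image size is an assumption; real images vary widely.
image_mb = 50
link_kbit_s = 120

bits = image_mb * 8 * 10**6            # image size in bits (decimal MB)
seconds = bits / (link_kbit_s * 1000)  # ideal transfer time, no overhead
print(round(seconds / 60))             # ~56 minutes for a single device
```

Multiply that by a fleet of thousands of devices and the appeal of doing more work locally becomes obvious.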
Edge computing could help the Internet of Things' adopters reduce the amount of data traversing the network, save bandwidth, and design connected systems that automatically execute actions (e.g., sending an alert notification to a manager, turning off the lights, or lowering the temperature) once a certain type of behavior is registered.
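A local rule of this kind can fire without any round trip to the cloud. The sketch below is a minimal rule engine; the conditions, thresholds, and handlers are hypothetical examples, not any vendor's API:

```python
# Minimal edge rule engine: each rule maps a condition on a telemetry
# reading to a local action, executed without a cloud round trip.
# Conditions, thresholds, and actions are hypothetical examples.

def notify_manager(reading):
    return f"ALERT: temperature {reading['temperature']} C"

def lower_temperature(reading):
    return "HVAC: setpoint lowered"

RULES = [
    (lambda r: r["temperature"] > 30, notify_manager),
    (lambda r: r["temperature"] > 27, lower_temperature),
]

def on_reading(reading):
    """Fire every rule whose condition matches the incoming reading."""
    return [action(reading) for cond, action in RULES if cond(reading)]

actions = on_reading({"temperature": 31})
print(actions)  # ['ALERT: temperature 31 C', 'HVAC: setpoint lowered']
```

Because the rules live on the device, the response time is bounded by local compute, not by network latency.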
IoT devices don’t have the “edge factor”. Here’s how to put edge computing to work
Several drivers are making edge computing a viable reality.
That said, many smart devices, especially in the consumer segment of the IoT, lack the memory to handle heavy operations, and may in fact be running on firmware rather than operating systems. This is why edge computing deployments to date have mostly been limited to ingesting, storing, filtering, and sending sensor data to the cloud.
When data analysis cannot be performed on the device itself, fog computing steps in
The technique introduces intermediary computers, networking devices, and small data centers that segment incoming traffic between the data source and the cloud.
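That traffic segmentation can be pictured as a per-message routing decision made at the fog node. The sketch below is illustrative only; the priority labels and the payload-size threshold are assumptions:

```python
# Sketch of fog-layer traffic segmentation: a fog node decides, per
# message, whether to process it locally or pass it on to the cloud.
# The priority labels and payload threshold are illustrative assumptions.

FOG_MAX_PAYLOAD = 1024  # bytes the fog node is willing to process locally

def route(message):
    """Return the tier that should process this message."""
    if message["priority"] == "critical":
        return "fog"      # low-latency handling close to the source
    if message["size_bytes"] > FOG_MAX_PAYLOAD:
        return "cloud"    # heavy payloads go to the data center
    return "fog"

print(route({"priority": "critical", "size_bytes": 64}))    # fog
print(route({"priority": "routine", "size_bytes": 50000}))  # cloud
```

The fog tier absorbs the latency-sensitive and lightweight traffic, so only the heavy, non-urgent payloads consume the uplink.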
Since edge computing deployments require a combination of on-premise and cloud data centers, IoT software development specialists first set up a data processing unit in the cloud, and then mimic its functionality on connected devices within the IT infrastructure.
To accomplish this goal, developers utilize cloud-managed services like AWS IoT Greengrass or Azure IoT Edge. With these services, edge devices can act on the data they generate and simultaneously use the cloud for storage and analytics:
- AWS IoT Greengrass is only supported by Linux-based edge devices, which, in turn, communicate with other gadgets varying in size and complexity from microcontroller-based solutions to industrial equipment.
- Azure IoT Edge allows developers to execute third-party services, AI-assisted data processing, and custom application logic on connected Linux and Windows devices via containers.
Both services ensure near real-time responses, encrypt sensor data, and enable edge devices to work offline or with intermittent connectivity to the cloud. Together, these capabilities make it relatively easy for smaller companies to design an effective cloud architecture for edge devices.
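The offline behavior both services advertise boils down to store-and-forward: the device keeps acting on data locally, buffers outbound telemetry, and flushes the buffer when the link comes back. The sketch below illustrates the pattern in isolation; the buffer size, the `online` flag, and the `sent` list standing in for a real cloud upload are all assumptions, not either vendor's API:

```python
from collections import deque

# Store-and-forward sketch: an edge device keeps working while offline,
# buffering telemetry until the cloud link is restored. The buffer size
# and the explicit `online` flag are illustrative assumptions.

class EdgeBuffer:
    def __init__(self, maxlen=1000):
        # Bounded queue: oldest readings are dropped if the link stays down
        self.queue = deque(maxlen=maxlen)
        self.sent = []  # stands in for the real cloud upload

    def publish(self, reading, online):
        self.queue.append(reading)
        if online:
            self.flush()

    def flush(self):
        # Drain the buffer in arrival order once connectivity returns
        while self.queue:
            self.sent.append(self.queue.popleft())

buf = EdgeBuffer()
buf.publish({"temp": 22}, online=False)  # buffered, link is down
buf.publish({"temp": 23}, online=False)  # still buffered
buf.publish({"temp": 24}, online=True)   # link restored: all three flush
print(len(buf.sent))  # 3
```

The bounded queue is the design choice that matters: it caps memory use on a constrained device at the cost of dropping the oldest readings during a long outage.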
In search of a killer app for IoT
Powered by new connectivity technologies like 5G, edge architectures will lay the foundation for a faster and more efficient IoT in the coming years.
At this point, however, a healthy balance of cloud and edge computing remains the preferred approach to IoT infrastructure development: despite higher latency and operating costs, centralized cloud-based data repositories offer far more storage and processing power than resource-constrained edge devices.