
Why Edge Computing is set to transform global industry

21st May 2019

While it is arguable that the world doesn’t need yet another technology buzz-phrase, the emergence of ‘edge computing’ is worth understanding.

Put simply, edge computing is when you collect and analyze data on the edge of the network where the data is generated – rather than in the cloud. Today, most cloud-based IT systems will send all of the data to the cloud for storage and processing and then wait for it to come back.

As the Internet of Things (IoT) grows, however, it is becoming clear that it is often unnecessary – and inefficient – to send every single bit of information to the cloud. A quick refresher: IoT refers to the extension of internet connectivity into physical devices and everyday objects. Embedded with electronics, internet connectivity, and other forms of hardware, these devices can communicate and interact with others over the Internet.

For businesses and industry, the key part of IoT will be the data generated by billions of smart sensors and devices. According to Cisco, the “Internet of Everything” – that is, all of the people and things connected to the internet – will generate 507.5 zettabytes (1 zettabyte = 1 trillion gigabytes) of data by 2019. Currently, most of the relevant data management and analysis is performed in the cloud or in enterprise data centers.

Increasingly (and this is where we loop back!), IoT technology providers are exploring edge computing to make the IoT model more efficient. With edge computing, sensors and connected devices will transmit data to a nearby edge computing device that processes or analyzes the data itself – instead of sending the data back to the cloud or a remote data center. In essence, edge computing is often better designed for collecting and processing data from IoT devices than the cloud.
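The pattern described above can be sketched in a few lines of Python. This is a minimal illustration with hypothetical names, not any particular vendor's API: the edge device reduces a batch of raw sensor readings to a compact summary, and only that summary crosses the network to the cloud.

```python
def summarize_readings(readings):
    """Reduce raw sensor readings on the edge device to a compact
    summary suitable for sending to the cloud."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# Raw samples stay on the edge device; only four numbers are transmitted.
raw = [21.3, 21.5, 22.1, 21.9, 21.7]  # e.g. temperature samples
summary = summarize_readings(raw)
```

With thousands of sensors sampling every few seconds, shipping summaries instead of raw streams is what makes the bandwidth and latency savings of edge computing concrete.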

For this reason, Business Insider has forecast that over 5 billion IoT devices owned by enterprises and governments will use edge computing for data collection and processing in 2020.

In South Africa, there are already use cases of edge computing within IoT (even though people might not classify it as such). Locally, the adoption of edge computing models will also be driven by the increasing availability of power-savvy micro-computing devices such as the CloudGate Xs.

Analysts are predicting that edge computing could be a transformative model for industry, including manufacturing, utilities, energy, transportation, healthcare, retail, and agriculture.

Let’s take agriculture as an example of how edge computing works in practice, using the scenario of an IoT watering system for grapes. There are critical advantages to bringing local farms into the 4th industrial revolution – particularly with water shortages becoming the norm.

Today, water wastage is very high in many existing irrigation systems. However, new IoT devices are becoming available which can measure soil moisture – right at the root of the plant – and trigger localized watering to bring it back up to a healthy level.

So instead of watering an entire crop, you can now isolate the irrigation to specific plants and only for the required time. For a given crop, you might need 4000 (dumb) devices connected and feeding back data. There is no point sending all of this data to the cloud to be processed so that instructions can be sent back, especially when bandwidth on the average farm might be problematic. The data could rather be fed to an edge device, which determines where the water must be released and for how long. This edge device can then send only high-level, summarized data to the cloud, which the farmer accesses and analyses using Business Intelligence (BI) tools to monitor water usage, growing speeds, forecasting and much more.
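The irrigation logic described above might look something like the following sketch. All names, thresholds and conversion factors here are illustrative assumptions, not real product values: the edge device reads per-plant soil moisture, opens only the valves for plants below a target level, and sends a short summary (rather than the raw readings) to the cloud for the farmer's BI dashboard.

```python
MOISTURE_TARGET = 35.0    # % soil moisture (illustrative threshold)
SECONDS_PER_PERCENT = 12  # watering seconds per % deficit (illustrative)

def watering_plan(moisture_by_plant):
    """Decide locally, on the edge device, which plants to water
    and for how long. Returns {plant_id: seconds_to_water}."""
    plan = {}
    for plant_id, moisture in moisture_by_plant.items():
        deficit = MOISTURE_TARGET - moisture
        if deficit > 0:
            plan[plant_id] = round(deficit * SECONDS_PER_PERCENT)
    return plan

def cloud_summary(moisture_by_plant, plan):
    """The only data sent upstream: a high-level summary for BI tools."""
    return {
        "plants_monitored": len(moisture_by_plant),
        "plants_watered": len(plan),
        "total_watering_seconds": sum(plan.values()),
    }

# Example: three vines, two of which are below the moisture target.
readings = {"vine-001": 28.0, "vine-002": 36.5, "vine-003": 31.0}
plan = watering_plan(readings)
summary = cloud_summary(readings, plan)
```

The design point is that the 4000-device crop in the paragraph above never ships its raw readings over the farm's limited bandwidth: the per-plant decisions happen at the edge, and the cloud only ever sees a handful of aggregate figures.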
