Several years ago, the software engineer Dave McCrory coined the term ‘data gravity’ to describe the idea that applications and services tend to accumulate wherever large amounts of data exist. For most of that time, the sites exerting the greatest ‘gravity’ were data centers and, later, clouds. But as the world has connected billions of devices through IoT infrastructures, new clusters of data are rapidly forming ‘on the edge.’
Of course, the pandemic greatly accelerated the generation of data at the edge, as corporations and governments scrambled to enable remote workers and build new ecosystems to connect with the customers and citizens they serve. The shift was already underway, driven by IoT-fueled trends such as Industry 4.0, but the pandemic compressed roughly seven years of digital growth into a 12-month span in 2020-21.
The edge computing trend is only accelerating. Industry observers predict that by 2025, more than 75% of all data, and 30% of workloads, will be processed at the edge. Growth at that pace would make edge computing one of the most abrupt, game-changing shifts in the history of IT.
That raises two core questions: what is edge computing, and why is it growing so fast?
‘The edge’ is simply the place where data is generated, and from an IT perspective it’s where the action lies for any connected organization. Edge computing is the deployment of processing power right at that point of generation and collection, so decisions can be made instantly and actions taken in real time. For example, new automotive designs include a bevy of sensors that collect data for driver-assist or autonomous-driving capabilities. Manufacturing equipment is loaded with sensors that capture performance and maintenance data. The same is true for implanted healthcare devices, oil rigs, mine shafts and construction sites.
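To make the pattern concrete, here is a minimal sketch in Python of what “deciding at the edge” can look like: a hypothetical edge device evaluates each sensor reading locally and acts the instant a threshold is crossed, while only batched summaries travel upstream. The sensor, threshold value, and helper functions are illustrative assumptions, not any real device’s API.

```python
import random
import time

# Illustrative alert threshold for a hypothetical vibration sensor on
# manufacturing equipment (the units and value are assumptions).
VIBRATION_ALERT_MM_S = 8.0


def read_vibration_sensor() -> float:
    """Stand-in for a real sensor driver; returns vibration in mm/s."""
    return random.uniform(0.0, 10.0)


def stop_machine() -> None:
    """Stand-in for a local actuator call; no network round trip needed."""
    print("Local decision: vibration too high, halting machine immediately.")


def forward_summary_to_cloud(readings: list[float]) -> None:
    """Only aggregated data leaves the edge, e.g. for fleet-wide analytics."""
    avg = sum(readings) / len(readings)
    print(f"Uploading summary of {len(readings)} readings (avg {avg:.2f} mm/s)")


def edge_loop(cycles: int = 100, batch_size: int = 25) -> None:
    batch: list[float] = []
    for _ in range(cycles):
        reading = read_vibration_sensor()
        # The time-critical decision happens here, on the device itself.
        if reading > VIBRATION_ALERT_MM_S:
            stop_machine()
        # Non-urgent data is batched and sent upstream later.
        batch.append(reading)
        if len(batch) >= batch_size:
            forward_summary_to_cloud(batch)
            batch.clear()
        time.sleep(0.01)


if __name__ == "__main__":
    edge_loop()
```

The design point is the split itself: the latency-sensitive path (read, check, act) never leaves the device, while the bandwidth-heavy raw data is reduced to summaries before anything crosses the network.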