In an earlier post, we explored the case for embedded intelligence in edge devices. In that post, we noted that edge devices with embedded intelligence have enough compute power and advanced programming to make their own operational decisions based on local data.
That’s the story of one intelligent device. But what happens when multiple edge devices need to work together and feed data to each other? This is where the edge cloud comes into play. As explained by my colleague Ryan Andersen, Director of Strategy for Dell Technologies, an edge cloud acts as an intelligent overlay to help capture, curate, analyze and protect the data generated by many edge devices. With the proper edge-cloud infrastructure, edge devices can learn together, making the collective whole more intelligent than the sum of its parts.
Let’s consider an example. A logistics use case might leverage real-time data on the inventory in a warehouse, the location of trucks, current traffic patterns, weather conditions and more to optimize operational processes. This use case requires integrating and analyzing data from multiple edge devices to gain more value from each device.
So how and where do you process all this data coming from different edge devices? If the data needs to be analyzed as it is generated, it probably makes sense to process it in an edge cloud that puts the analytics capabilities close to where the data is generated, rather than sending the data to a centralized private or public cloud data center.
This brings us to the all-important issue of service constraints that play into the use cases for edge cloud. These constraints include network latency, data volumes, connectivity issues, endpoint locations, and requirements for data security and privacy. In many cases, these constraints will dictate that you process data locally, preferably in an edge cloud. For example, if you have large amounts of data, limited network bandwidth and real-time usage requirements, your use case might dictate (or constrain) that data be processed at or close to the edge.
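The constraint-driven reasoning above can be sketched as a toy decision rule. This is purely illustrative: the `Workload` fields, thresholds, and the `choose_processing_site` function are hypothetical assumptions, not anything from the InfoBrief, but they show how volume, bandwidth, latency, and privacy constraints might each force processing to the edge.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """Hypothetical service constraints for a single edge data stream."""
    data_gb_per_hour: float      # volume of data generated
    uplink_mbps: float           # available bandwidth to the central cloud
    max_latency_ms: float        # how quickly results are needed
    data_must_stay_local: bool   # privacy/security requirement

def choose_processing_site(w: Workload) -> str:
    """Toy decision rule: process at the edge when constraints demand it."""
    # Privacy/security constraints override everything else.
    if w.data_must_stay_local:
        return "edge"
    # Real-time requirements rule out a round trip to a distant data center
    # (the 50 ms threshold here is an arbitrary example value).
    if w.max_latency_ms < 50:
        return "edge"
    # Can the uplink carry the data as fast as it is generated?
    required_mbps = w.data_gb_per_hour * 8000 / 3600  # GB/hour -> Mb/s
    if required_mbps > w.uplink_mbps:
        return "edge"
    return "central cloud"

# Example: a high-volume, real-time stream is forced to the edge
print(choose_processing_site(
    Workload(data_gb_per_hour=90, uplink_mbps=100,
             max_latency_ms=20, data_must_stay_local=False)))  # edge
```

In practice these constraints interact (and change over time), which is part of the argument for deploying a general-purpose edge cloud rather than hard-wiring one application's assumptions into the infrastructure.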
A new IDC InfoBrief sponsored by Dell Technologies explores the edge cloud concept in more detail and summarizes how service constraints drive the need for edge cloud in various use cases. The brief shows, for example, that while fleet management and field services have a moderate propensity toward edge cloud processing, public safety and emergency response have a very high propensity toward the edge — because these are applications where every second can count.
How do you get started down the path to an edge cloud? Ryan advises starting with a clear customer-focused or operation-focused use case, and then building out an integrated business case that highlights the customer and operational benefits that stem from that use case and the edge cloud.
“When it comes to building the edge cloud infrastructure, it’s important to think broadly about multiple use cases,” he says. “While you may be building an edge cloud to support a certain application, you want to deploy the infrastructure as if it were a multipurpose private cloud that will support various use cases. Today that might be a logistics application. Tomorrow that might be an asset management or customer service application.”
With this same long-term view in mind, you will want to partner with a technology provider that has end-to-end reach, Ryan advises.
“While there are many niche players who could help you solve certain pieces of the edge cloud puzzle, you need a partner who can help you conquer the entire challenge,” Ryan says. “You want a partner who has the experience, proven solutions and access to a rich ecosystem to take you from an initial proof of concept to a global deployment.”
A growing sense of urgency
When should organizations start to consider their edge cloud strategies? The answer is driven by the dramatic and ongoing growth in the number of connected edge devices that provide the data that organizations are using to differentiate their products and services. Just consider these figures cited in IDC’s edge cloud InfoBrief:
- By 2023, half of on-premises infrastructure deployed will be in critical edge locations, up from less than 10 percent today.
- Over the same period, the number of applications running on edge infrastructure will increase by 800 percent and data creation at the edge will grow 1.6 times faster than other enterprise data sources.
- By 2023, the edge will lead to the deployment of 25 billion endpoints capable of AI inferencing.
When you consider growth and diversity like that, it’s clear that organizations need an edge strategy enabled by the edge cloud. Those organizations need to focus on how they will process, analyze, learn from and leverage the collective intelligence in edge devices. The edge cloud is the key to making that happen.
To learn more:
For a deeper dive, read the IDC InfoBrief “The Edge Cloud: Enabling an Intelligent Digital World.” And for a look at the ways in which organizations are using edge computing and edge cloud solutions to unlock the full potential of data, visit DellTechnologies.com/edge.