Managing the Wild, Wild Edge

BrandPost By Pete Bartolik
Apr 16, 2021
Technology Industry

Operating edge devices across multiple environments requires finding a consistent approach for managing interfaces, processes, and protocols.


The concept behind edge computing is simple: bringing compute resources and analytics capabilities closer to where data is created. Edge can help organizations create smart vehicles, for example, as well as smart factories, smart homes, and smart cities. But there’s little consistency across those environments—each has unique characteristics that need to be harnessed.

For businesses that plan to operate edge devices across multiple environments, the challenge is finding a consistent layer to manage the inherent inconsistencies in interfaces, processes, and management protocols.

Some companies want to leverage unprecedented numbers of Internet of Things (IoT) devices. Telcos are looking to maximize the potential of 5G devices at the edge. Manufacturers are eager to manage their own 5G local area networks in previously wireless-hostile environments. Consumer product companies want to be able to deliver new services and new capabilities.

Each of those use cases envisions leveraging intelligence as close to a device as practical, to overcome latency and to automate actions in real time. The edge does not operate in isolation, however; cloud services are crucial for analyzing the huge amounts of data generated at the edge, as well as for creating and fine-tuning the algorithms that can be downloaded to edge systems.

“Edge is all about distribution and scale,” says Pete Cruz, technical marketing manager with Red Hat. “Organizations want to push data and apps closer to the users and the technology that consume them.”

But deploying and managing apps and data at the edge is a far cry from managing a data center where everything is under centralized control. Organizations must adjust to greater autonomy out on the edge, and new levels of scale, both of which require a high degree of automation.

“Scale, consistency, compliance and security are huge concerns,” says Cruz. “You have to bring automation into the mix so you can distribute the right apps to the right locations and do so with the right consistency. If something goes awry you need to know how it was distributed.”

Red Hat Enterprise Linux (RHEL), along with the OpenShift hybrid cloud platform, provides organizations with an edge in managing the edge, Cruz says. RHEL is the leading Linux distribution and provides a single production-grade Linux platform that can span the entirety of an enterprise, from on-premises servers to the public cloud and from core data centers to the farthest-flung edge devices.

Built to provide the levels of supportability, stability, and security features required by enterprise edge deployments, RHEL also provides the foundation for Kubernetes orchestration and management of software containers on the OpenShift Container Platform.
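In practice, targeting a workload at edge locations through Kubernetes is typically done with node labels and selectors. The sketch below is a generic, illustrative Deployment manifest, not a Red Hat example; the image name, label key (node-role.kubernetes.io/edge), and namespace are assumptions for illustration:

```yaml
# Illustrative sketch: pin a containerized analytics workload to nodes
# labeled as edge nodes, so the scheduler keeps it close to the data source.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-analytics          # hypothetical workload name
  namespace: edge-apps            # assumed namespace
spec:
  replicas: 1
  selector:
    matchLabels:
      app: sensor-analytics
  template:
    metadata:
      labels:
        app: sensor-analytics
    spec:
      # Only schedule onto nodes carrying this (assumed) edge label.
      nodeSelector:
        node-role.kubernetes.io/edge: ""
      containers:
        - name: analytics
          image: registry.example.com/sensor-analytics:1.0  # placeholder image
          resources:
            limits:
              cpu: "500m"         # edge hardware is often resource-constrained
              memory: "256Mi"
```

The nodeSelector is the simplest mechanism; larger fleets often graduate to taints, tolerations, and affinity rules to express more nuanced placement policies.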

Extending Red Hat’s open hybrid cloud model to the edge enables organizations to rapidly create operating system images for the edge; stage and apply updates at the next device reboot or power cycle; and support over-the-air updates for locations with limited or intermittent connectivity.
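The staged-update workflow described above is commonly automated. As one hedged sketch, assuming an ostree-based edge image and an Ansible inventory group named edge_devices, an update could be staged on each device and applied at a controlled reboot; the group name and the "No upgrade available" output check are assumptions for illustration:

```yaml
# Sketch: stage an OS update on ostree-based edge devices, then reboot
# into the new deployment only when an update was actually staged.
- name: Stage and apply edge OS update
  hosts: edge_devices          # assumed inventory group of edge systems
  become: true
  tasks:
    - name: Stage the new deployment (does not disrupt the running system)
      ansible.builtin.command: rpm-ostree upgrade
      register: upgrade_result
      changed_when: "'No upgrade available' not in upgrade_result.stdout"

    - name: Reboot into the staged deployment during the maintenance window
      ansible.builtin.reboot:
        reboot_timeout: 600    # allow slow edge hardware time to come back
      when: upgrade_result.changed
```

Because the update is staged rather than applied live, a device with intermittent connectivity can download the image when a link is available and switch over at its next power cycle.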

As innovation unfolds at the edge, the open, standards-based hybrid cloud model will provide organizations with the ability to architect edge deployments and tame the new frontier.

Red Hat sees edge differently. See how: https://www.redhat.com/en/topics/edge-computing/approach?sc_cid=7013a000002w1CwAAI