Red Hat Ansible Automation Platform and OpenShift Kubernetes can team up to automate and orchestrate edge applications and devices.

Unlike traditional data centers, edge devices and services are managed outside the typical management sphere. Platforms are pushed outside the data center, devices are spread across huge areas in inaccessible locations, and applications run on demand closer to the data. Scaling edge computing and Internet of Things (IoT) deployments represents a challenge that can't be solved without a high degree of automation.

Swiss Federal Railways (SBB), for example, planned to invest close to US$1 billion annually in new and modernized trains to create a smart, safe, and highly efficient rail network. Those new trains will include intelligent features such as dynamic LED information displays, digital seat booking systems, CCTV safety monitoring, and Wi-Fi access. Ranked among the world's best railway operators, SBB sought an open source management platform to establish an IT infrastructure that could centrally manage all of the intelligent devices across its rail network. Red Hat Enterprise Linux (RHEL) provided a stable, reliable foundation for scaling existing applications and adopting emerging technology.

In that Red Hat environment, SBB uses Red Hat Ansible Automation to automate complex deployments and centrally control its IT infrastructure through a visual dashboard with features such as role-based access, scheduling, integrated notifications, and graphical inventory management. The company's plan included connecting up to 300 trains by early 2020. By replacing a complex, manual configuration process with Ansible, SBB reduced configuration time for each train from 5 days to 3 hours. Ansible also eliminates the need for technicians to individually plug a USB drive into each train: SBB can now manage updates by vehicle type, even while a train is in motion, and avoid any fleet-wide service impact.

SBB had previously selected Red Hat OpenShift Container Platform as part of its IT modernization program. Container-based architectures provide flexibility that traditional monolithic infrastructure lacks, because applications are packaged into a container with only the required operating system components. That packaging improves portability across architectures and allows for greater scalability as business demands evolve.

Containerized applications help simplify edge architectures. They can be deployed across different infrastructures without changes, and containerized microservices developed for one project can be reused in others. But as organizations scale up, trying to manage thousands of clusters of containers can quickly overwhelm human capabilities. "If you are dealing with maybe 700 servers, maybe that's OK, but once you hit 7,000 or 70,000 there are a lot of things you can't do manually," says Ben Cohen, senior product marketing manager for cloud platforms at Red Hat. "So, you need automation to do remote upgrades and maintenance, for example."

Many enterprises are leveraging Kubernetes, an open-source system for automating deployment, scaling, and management of containerized applications, to manage hybrid cloud and multicloud deployments. Ansible and Red Hat OpenShift make the hard tasks in Kubernetes easier through automation and orchestration.
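The article does not show SBB's actual playbooks, but the pattern it describes maps naturally onto ordinary Ansible constructs. The sketch below is purely illustrative: the inventory group, package name, and file paths (trains_type_a, onboard-display, /etc/onboard/display.conf) are assumptions, not SBB's real configuration. It shows how an update could be scoped to one vehicle type and rolled out a few hosts at a time rather than across the whole fleet.

```yaml
---
# Hypothetical playbook: roll out a configuration change to one vehicle type
# at a time. Group names, packages, paths, and variables are illustrative only.
- name: Update onboard configuration for one vehicle type
  hosts: trains_type_a        # assumed inventory group, one group per vehicle type
  become: true
  serial: 10                  # touch a few vehicles at a time, never the whole fleet
  tasks:
    - name: Ensure the onboard display package is up to date
      ansible.builtin.package:
        name: onboard-display
        state: latest

    - name: Apply the vehicle-type-specific configuration
      ansible.builtin.template:
        src: templates/display.conf.j2
        dest: /etc/onboard/display.conf
        mode: "0644"
      notify: Restart onboard display

  handlers:
    - name: Restart onboard display
      ansible.builtin.service:
        name: onboard-display
        state: restarted
```

Grouping hosts by vehicle type in the inventory and limiting each run with the serial keyword is one way to get the "update by vehicle type, avoid fleet-wide impact" behavior described above, while a management layer such as the dashboard SBB uses would add the scheduling, notifications, and role-based access the article mentions.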
Red Hat OpenShift fully supports Kubernetes Operators, which help simplify deployment, management, and operations of stateful applications. An Operator is an application-specific controller that extends the Kubernetes API to create, configure, and manage instances of complex stateful applications on behalf of a Kubernetes user. Kubernetes Operators can be written in Ansible, leveraging the Ansible YAML language to encode human operational knowledge and automate Kubernetes environments. Connecting Red Hat Ansible workloads to Red Hat OpenShift breaks down the silos between traditional servers and cloud-native clusters.
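To make that idea concrete, here is a minimal sketch of an Ansible-based Operator, loosely following the pattern used in the Operator SDK's Ansible examples. The EdgeApp custom resource, the edge.example.com API group, the edgeapp role, the size spec field, and the container image are all hypothetical; ansible_operator_meta is the variable the Ansible operator runtime injects with the custom resource's name and namespace.

```yaml
---
# watches.yaml (hypothetical): tells the Ansible-based operator which custom
# resource to watch and which role encodes how to reconcile it.
- group: edge.example.com
  version: v1alpha1
  kind: EdgeApp
  role: edgeapp
```

```yaml
---
# roles/edgeapp/tasks/main.yml (hypothetical): declare the desired state for
# each EdgeApp custom resource. 'size' is an assumed field from the CR spec;
# ansible_operator_meta carries the CR's name and namespace.
- name: Ensure the edge application Deployment matches the custom resource
  kubernetes.core.k8s:
    state: present
    definition:
      apiVersion: apps/v1
      kind: Deployment
      metadata:
        name: "{{ ansible_operator_meta.name }}-app"
        namespace: "{{ ansible_operator_meta.namespace }}"
      spec:
        replicas: "{{ size }}"
        selector:
          matchLabels:
            app: "{{ ansible_operator_meta.name }}"
      template:
          metadata:
            labels:
              app: "{{ ansible_operator_meta.name }}"
          spec:
            containers:
              - name: edge-app
                image: quay.io/example/edge-app:latest
```

When a user creates or changes an EdgeApp resource, the operator runs the role and the kubernetes.core.k8s task reconciles the Deployment to the declared state, which is exactly the kind of human operational knowledge the article says can be encoded in Ansible YAML.

Red Hat sees edge differently. See how: https://www.redhat.com/en/topics/edge-computing/approach?sc_cid=7013a000002w1CwAAI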