Why containers will drive transformations in the 2020s

BrandPost By Robert Christiansen
Apr 29, 2020

Every decade or so, a new IT infrastructure model barges onto the scene and changes the way organizations use technology. In the 1990s, client/server architectures put computing resources in the back room and doled them out as needed. In the 2000s, virtual machines (VMs) created the ability to emulate one computer’s resources on another. In the 2010s, cloud hit big, helping companies become more agile and cost focused.

Now that we’ve entered a new decade, what model will dominate the conversation? Based on current trends and expert forecasts, it’s clear that the 2020s will be defined by containers and microservices.

What exactly are containers?

Getting containers right is critical to success in any transformational journey. Given the impact the technology is expected to have over the next decade, it’s worth looking more closely at what containers are and why they’re becoming so popular.

Anybody involved in enterprise IT has at least a passing knowledge of containers. Like their physical counterparts, these lightweight, isolated software packages bundle items away for future use. They hold all the executables an IT team needs to run everything from a small microservice (like a single HTTP endpoint) to a much larger application (like a payroll program). Each container has its own binary code, libraries, and configuration files – but it doesn't carry an operating system image; containers share the host's kernel instead. That makes containers lighter and easier to transport than applications in traditional hardware or VM environments.
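To make the packaging idea concrete, here is a minimal sketch using the Docker SDK for Python (docker-py). It assumes Docker Engine and the docker Python package are installed locally; the image name and command are illustrative, not from the article.

```python
# Minimal illustrative sketch: run a container from an image that bundles the
# app's binaries, libraries, and config files -- but no OS kernel. At runtime
# the container shares the host's kernel, which is what keeps it lightweight.
import docker

client = docker.from_env()  # connect to the local Docker daemon

output = client.containers.run(
    "python:3.8-slim",  # illustrative image name
    command=["python", "-c", "print('hello from a container')"],
    remove=True,  # delete the container once it exits
)
print(output.decode())
```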

3 key benefits of containers

Containers offer a wide variety of benefits – most notably speed, choice, and the ability to optimize for the situation at hand.

• The need for speed: Speed, of course, is critical in today's IT world. Moving software quickly through the various stages of development improves efficiency, increases productivity, and allows more time for testing and quality control. Fast processes enable firms to get to market sooner and update more frequently. That's the name of the game.

Using containers, your teams can speed up delivery in two ways. First, because VMs contain entire operating systems, they take time to boot each time they're used. Containers don't need to boot an operating system; the host OS is already running.

Second, teams using containers can release software in smaller increments than they can with legacy waterfall processes. Containers also strip away the layers of software that stand between the application's execution and the hardware performing the task at hand, so the hardware is effectively dedicated to serving the app alone.

For example, let's say you have an AI app and want to use a graphics processing unit (GPU). Ideally, you want to use that GPU as effectively as possible. The more software that sits between the GPU and the orchestration function, the less effectively the GPU will work. Stripping away unnecessary software gives you a higher density of containers per computer and better utilization for that system, thus increasing the processing speed for that particular use case.
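As an illustrative sketch of this kind of setup (not from the article), here is how a team might request a GPU for a containerized inference workload using the official Kubernetes Python client. It assumes a reachable cluster with the NVIDIA device plugin installed; the pod name and image are placeholders.

```python
# Hypothetical sketch: schedule a single container onto a GPU node with the
# official "kubernetes" Python client. Names and image are illustrative.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-inference"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="example.com/ai-inference:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    # Ask the scheduler for one GPU so the container gets
                    # direct access to the device, with no guest OS in between.
                    limits={"nvidia.com/gpu": "1"}
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```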

• The importance of platform choice: No one likes vendor lock-in. Developers and IT operators alike want to pick the best platform to run their apps, which means they need the ability to migrate apps from platform to platform with little friction. Containers enable that choice. At its heart, a container holds all the software needed to run the app – in the container. That's why they call them containers!

This technology is important to a multi-cloud or hybrid cloud strategy because it gives you more choices. And choice matters when you are dealing with issues such as data privacy and data residency. It also matters when you must run an application close to an edge device and need the freedom to decide where to place it.

Having a container strategy gives you choice. Kubernetes has emerged as the platform being adopted across the industry today; it lets people run any app that supports the Kubernetes APIs, which gives users choice and flexibility. Three years from now, you may want to run your apps on another cloud or on-premises. If you selected an open source standard like Kubernetes, the likelihood you'll be able to move is very high. Open standards, supported through open source communities, give you that flexibility. Developing a solid container strategy is the first step.
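To make the portability point concrete, here is a small hypothetical sketch: because every conformant cluster exposes the same Kubernetes APIs, the same client code can query an on-prem cluster and a public-cloud cluster simply by switching kubeconfig contexts. The context names below are made up for illustration.

```python
# Hypothetical sketch of Kubernetes API portability: the context names
# ("on-prem", "public-cloud") are illustrative entries in a local kubeconfig.
from kubernetes import client, config

for context in ("on-prem", "public-cloud"):
    config.load_kube_config(context=context)  # point at a different cluster
    apps = client.AppsV1Api()
    deployments = apps.list_namespaced_deployment(namespace="default")
    names = [d.metadata.name for d in deployments.items]
    print(f"{context}: {names}")  # same code, any conformant cluster
```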

• The value of optimization: A fair amount of debate concerns whether to use containers with VMs or in place of VMs. In my opinion, running containers on their own lets organizations take advantage of all their lightweight features and optimize their environments for success. You can pack more applications onto a host computer using containers in place of VMs. Running containers inside a VM, by contrast, is like hitching a horse to the front of a car. Why take something that is optimized for speed and then go backward in technology by running it inside a VM?

A container strategy should bridge both public and private clouds. Containers enable you to take a step toward collapsing the technology stack and eliminating the unneeded weight that typically comes with VMs.

Now is the time

The technology driving transformations in the 2020s is clearly containers. They offer a host of benefits for organizations seeking speed, choice, and flexibility. As this decade begins, now is the time to start building out what your new ecosystem looks like – both in the data center and in the cloud. Containers should be a vital piece of your core infrastructure.

Original article first published on Hewlett Packard Enterprise’s digital publication “Enterprise.nxt”. Reproduced with permission.

____________________________________

About Robert Christiansen

Robert Christiansen is a key executive in the CTO Office of Hybrid IT at Hewlett Packard Enterprise, setting the strategy and evangelizing the company's vision. Hybrid IT is a $25B group and central to HPE's core technologies. In this role, Robert spends his time with key global clients and partners, deepening relationships and aligning joint technology efforts to improve the way people live and work.
Robert is a contributing writer for CIO, Forbes, TechTarget, and numerous industry magazines, and he is a major contributor to The Doppler, the cloud industry's thought-leadership publication. He is also a keynote speaker at numerous technology and HPE-led events, clearly articulating technology shifts while having a great time doing it!