Every decade or so, a new IT infrastructure model barges onto the scene and changes the way organizations use technology. In the 1990s, client/server architectures put computing resources in the back room and doled them out as needed. In the 2000s, virtual machines (VMs) created the ability to emulate one computer's resources on another. In the 2010s, cloud hit big, helping companies become more agile and cost focused.

Now that we've entered a new decade, what model will dominate the conversation? Based on current trends and expert forecasts, it's clear that the 2020s will be defined by containers and microservices.

What exactly are containers?

Getting containers right is critical to success in any transformational journey. Given the impact the technology is expected to have over the next decade, it's worth looking more closely at what containers are and why they're becoming so popular.

Anybody involved in enterprise IT has at least a passing knowledge of containers. Like their physical counterparts, these virtual operating system configurations pack items away for future use. They contain all the executables an IT team needs to run everything from a small microservice (like a single HTTP endpoint) to a much larger application (like a payroll program). Each container has its own binary code, libraries, and configuration files – but it doesn't contain an operating system image. That makes containers lighter and easier to transport than applications in traditional hardware or VM environments.

3 key benefits of containers

Containers offer a wide variety of benefits – most notably speed, choice, and the ability to optimize for the situation at hand.

• The need for speed

Speed, of course, is critical in today's IT world. Moving software quickly through the various stages of development improves efficiency, increases productivity, and allows more time for testing and quality control.
Fast processes enable firms to get to market faster and update more frequently. That's the name of the game.

Using containers, your teams can speed up delivery in two ways. First, because VMs contain entire operating systems, they take longer to boot up each time they're used. Containers don't need to boot an operating system; it's already running on the host.

Second, teams using containers can release software in smaller segments than they can in legacy waterfall processes. Containers also eliminate the layers of software that stand between the application's execution and the hardware that performs the task at hand. The effect is something like purpose-built hardware that serves the app alone.

For example, let's say you have an AI app and want to use a graphics processing unit (GPU). Ideally, you want to use that GPU as effectively as possible. The more software that sits between the GPU and the orchestration function, the less effectively it will work. Stripping away unnecessary software gives you a higher density of containers per computer and better utilization of that system, thus increasing the processing speed for that particular use case.

• The importance of platform choice

No one likes vendor lock-in. Developers and IT operators alike want to pick the best platform to run their apps, which means they need the ability to migrate apps from platform to platform with little friction. Containers enable choice. At their heart, a container holds all the software needed to run the app – in the container. That's why they call them containers!

This technology is important to a multi-cloud or hybrid cloud strategy because it gives you more choices. And choice matters when you are dealing with issues such as data privacy and data residency. Also, if you must run an application close to an edge device, you need choice about where you can place it.

Having a container strategy gives you choice.
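To make the portability point concrete, here is a minimal sketch of a Kubernetes Deployment manifest, the standard way to describe a containerized app to a cluster. The app name, image name, and port below are hypothetical placeholders, not anything from this article; the point is that everything the app needs ships inside the image, so the same manifest can be applied unchanged to any conformant cluster, on-prem or in any public cloud.

```yaml
# Minimal, hypothetical Deployment manifest.
# "registry.example.com/payroll:1.4" and port 8080 are placeholders;
# the image itself carries the app's binaries, libraries, and config.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: payroll
spec:
  replicas: 3                 # run three identical copies of the container
  selector:
    matchLabels:
      app: payroll
  template:
    metadata:
      labels:
        app: payroll
    spec:
      containers:
      - name: payroll
        image: registry.example.com/payroll:1.4
        ports:
        - containerPort: 8080
```

Because the manifest references only standard Kubernetes APIs and a self-contained image, moving the app to another provider is largely a matter of pointing your tooling at a different cluster.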
Kubernetes appears to be the accepted platform being adopted across the industry today; it enables people to run any app that supports the Kubernetes APIs – and that gives users choice and flexibility. Three years from now, you may want to run your apps on another cloud or on-prem. If you selected an open source standard like Kubernetes, the likelihood you'll be able to move is very high. Open standards, supported through open source communities, give you that flexibility. Developing a solid container strategy is the first step.

• The value of optimization

A decent amount of debate concerns whether to use containers with VMs or in place of VMs. In my opinion, running containers separately enables organizations to take advantage of all the lightweight features and optimize their environments for success. You can pack more applications on a host computer using containers in place of VMs. Conversely, running containers inside a VM is like attaching a horse to the front of a car. Why take something that is optimized for speed and then go backward in technology to run it inside a VM?

A container strategy should bridge both the public and the private cloud. Containers let you take a step forward toward collapsing the technology stack and eliminating the unneeded weight that typically comes with VMs.

Now is the time

The technology driving transformations in the 2020s is clearly containers. They offer a host of benefits for organizations seeking speed, choice, and flexibility. As this decade begins, now is the time to start building out what your new ecosystem looks like – both in the data center and in the cloud. Containers should be a vital piece of your core infrastructure.

Original article first published on Hewlett Packard Enterprise's digital publication "Enterprise.nxt".
Reproduced with permission.

____________________________________

About Robert Christiansen

Robert Christiansen is a key executive in the CTO Office of Hybrid IT at Hewlett Packard Enterprise, setting the strategy and evangelizing the company's vision. Hybrid IT is a $25B group and central to HPE's core technologies. In this role, Robert spends his time with key global clients and partners, deepening relationships and aligning joint technology efforts to improve the way people live and work.

Robert is a contributing writer for CIO, Forbes, TechTarget, and numerous industry magazines, and is a major contributor to The Doppler, the cloud industry's thought-leadership publication. He is also a keynote speaker at numerous technology and HPE-led events, clearly articulating technology shifts while having a great time doing it!