Being a chief information officer or IT manager can be difficult, but it's rarely boring. Given the accelerating pace of technological change, as soon as these professionals become comfortable with one cutting-edge technology, another compelling advance demands their attention.

For many IT leaders, containers have become one of the top items on their "to-do" lists. These standards-based code wrappers emerged from the open source community a few years ago, and have matured rapidly and gained adoption ever since.

Containers promise a range of benefits, including many related to public cloud deployments. But the technology is new enough that fully exploiting its potential can still be quite challenging for many companies.

Unlike virtual machines, which run on top of a hypervisor layer and each contain their own operating system instance, applications, and support libraries, containers can be much more compact and faster to start. In part, that's because containers don't require their own OS or a hypervisor. Instead, they use the host's OS to wrap application code, runtimes, tools, and libraries into isolated units that can be easily moved and deployed across different platforms and environments.

Among the companies that have already adopted containers, there is evidence of both enthusiasm and caution. In early 2017, Forrester Research surveyed nearly 200 such organizations across several countries and found that 63% already had more than 100 containers deployed, while 82% expected to have more than 100 containers in use within two years.

These early adopters were also exploiting one of containers' key features: their portability. Among the survey respondents, 82% had deployed containers in private clouds, 53% in public clouds, and 36% on traditional infrastructure.
Among the many benefits cited were increased speed, improved security, and a consistent deployment process.

The container landscape is starting to gel, filling gaps in standards, management systems, orchestration tools, and other areas that will address many of the technology's most pressing needs. The major public cloud providers have also rushed to support container-based applications and microservices, recognizing the many synergies between containers and the cloud.

In one notable example of this synergy, Densify, a predictive analytics and cloud optimization service, is helping organizations use containers to reduce their public cloud costs. Densify's analytics determine how to stack application workloads in a way that reduces resource contention while increasing density. The company has helped organizations use containers to host multiple application workloads within a single Amazon Web Services (AWS) instance.

To illustrate the potential of this strategy, Densify shared a case study in which it analyzed the utilization patterns and requirements of 983 workloads, each running in an independent AWS instance. Thanks to its workload-stacking analytics, Densify found that it could host all 983 containers in just 32 extra-large AWS instances. By moving these workloads from 983 AWS instances to containers stacked in 32 extra-large instances, the one-year hosting cost dropped from $1.89 million to $325,300, a net savings of 82%.

As more organizations grow comfortable with container technology, and as they see others realizing these kinds of savings, the steady move toward containers will likely become a stampede. But creating, deploying, and optimizing containers on a broader scale will demand that companies have deep visibility into how these building blocks consume the underlying resources that they share.
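The consolidation math behind the case study is easy to check. The short Python sketch below reproduces the reported figures from the numbers in the text; the variable names are illustrative, not Densify's, and the costs are the published case-study totals, not a pricing model.

```python
# Case-study figures as reported: 983 single-workload AWS instances
# consolidated into 32 extra-large instances via workload stacking.
workloads = 983
stacked_instances = 32

cost_before = 1_890_000   # one-year hosting cost for 983 instances, in dollars
cost_after = 325_300      # one-year cost for 32 extra-large instances, in dollars

# Derived values: absolute savings, percentage savings, and average
# container density per stacked instance.
savings = cost_before - cost_after
savings_pct = savings / cost_before * 100
density = workloads / stacked_instances

print(f"Savings: ${savings:,} ({savings_pct:.1f}%)")
print(f"Average density: {density:.1f} workloads per instance")
```

The exact savings come to about 82.8%, which the article rounds to 82%; the stacking ratio works out to roughly 31 workloads per extra-large instance.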