Tests conducted in the Dell EMC HPC and AI Innovation Lab show that software can be containerized with no significant performance penalties.

Containerization simplifies the task of managing and distributing software. It bundles applications and all their software dependencies into portable packages that can be moved easily from system to system. This approach to software distribution simplifies IT operations: software development teams can assemble all the pieces into a package that is ready to run on compatible hardware, and IT administrators can focus on the infrastructure that will run the containerized application without worrying about software issues.

Those are the upsides of containerization. But there is a question about the potential performance penalties that come with containerized applications. In the IT world, there is a perception that abstraction leads to degraded performance. We put this perception to the test in the Dell EMC HPC and AI Innovation Lab. To cut to the chase, our tests showed that software can be containerized with no significant performance penalties.

The tests

For our tests, we used the Dell EMC Ready Solution for AI – Deep Learning with Intel. This CPU-based scale-out solution provides a flexible platform for training a wide variety of neural network models with different capabilities and performance characteristics. The platform uses Nauta, an open source deep learning training platform built on cloud-native technologies such as Docker and Kubernetes. Nauta provides a simplified software environment that can easily be customized to suit a data scientist's requirements.
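A claim of "no significant performance penalties" comes down to running the same workload on bare metal and inside a container and comparing throughput. The sketch below shows the arithmetic only; the throughput figures are hypothetical placeholders, not measurements from the lab.

```python
# Illustrative only: hypothetical throughput figures (images/sec) for the
# same training workload run on bare metal and inside a container.
bare_metal_throughput = 412.0   # hypothetical bare-metal measurement
container_throughput = 408.5    # hypothetical containerized measurement

# Relative overhead of the containerized run, as a percentage of bare metal.
overhead_pct = (bare_metal_throughput - container_throughput) / bare_metal_throughput * 100
print(f"Container overhead: {overhead_pct:.2f}%")
```

With these made-up numbers the overhead is under one percent, which is the kind of result the lab's tests describe as no significant penalty.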
We measured and analyzed the performance of this solution using three deep learning training use cases:

- Image classification using convolutional neural networks
- Language translation using multi-head attention networks
- Product recommendation using restricted Boltzmann machines

We chose these use cases for their computational and workload diversity, to highlight the flexibility of the solution for applications across different customer segments and problem types. In all three use cases, our tests demonstrated near-linear scaling in performance up to the full size of the solution, with no performance penalties. Additional tests on analogous hardware in the Dell EMC Zenith cluster in our lab showed that the solution can scale all tested use cases beyond 16 compute nodes. These tests confirmed that IT organizations can scale the solution as their compute requirements grow without taking a performance hit.

Key takeaways

In our lab tests, we demonstrated that, in addition to giving the data scientist greater flexibility when training models, the use of containers does not adversely affect the performance of the solutions we examined. In fact, we found that in some cases organizations can expect better performance from containerized workloads on the solution than from the same hardware deployed in a bare-metal configuration.

So, where do we go from here? Our successful tests suggest that we can explore the use of containers for other performance-critical use cases, such as high performance computing and financial transactions. And we can think more broadly about containers to answer questions like: Can we run parallel applications inside containers? Can we use containers to achieve simplified use and management across the computing spectrum? These questions are worth pursuing as we move into a world where containers are sure to be used more broadly.
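"Near-linear scaling" means that doubling the node count nearly doubles training throughput. A minimal sketch of how scaling efficiency is typically computed follows; the throughput figures are hypothetical, not the lab's results.

```python
# Hypothetical aggregate training throughput (samples/sec) at each node count.
throughput = {1: 100.0, 2: 196.0, 4: 388.0, 8: 760.0, 16: 1490.0}

base_nodes = min(throughput)        # smallest configuration is the baseline
base_rate = throughput[base_nodes]

for nodes, rate in sorted(throughput.items()):
    speedup = rate / base_rate
    # Parallel efficiency: measured speedup divided by ideal (linear) speedup.
    efficiency = speedup / (nodes / base_nodes)
    print(f"{nodes:2d} nodes: speedup {speedup:5.2f}x, efficiency {efficiency:5.1%}")
```

With these placeholder numbers, efficiency stays above 90% out to 16 nodes, which is the shape of result the near-linear scaling claim describes.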
To learn more

For a fuller look at this story, see the white paper "Dell EMC Ready Solutions for AI – Deep Learning with Intel: Measuring performance and capability of deep learning use cases." For independent third-party findings, read the ESG technical validation of the Dell EMC Ready Solutions for AI – Deep Learning with Intel. For a high-level look at Nauta software, see my blog "Simple, Scalable, Containerized Deep Learning using Nauta."

Lucas Wilson, Ph.D., is an artificial intelligence researcher and lead data scientist in the HPC and AI Innovation Lab at Dell EMC.