Robust research programs require equally robust high-performance computing (HPC) environments. When an organization runs applications that span the domains of weather forecasting, engineering, chemistry, biology, and more, it needs HPC systems to churn through massive amounts of data to answer hard questions, gain new insights, and make groundbreaking discoveries.
That’s the way it is at Texas Tech University (TTU), which works actively to provide its faculty and students with access to the HPC resources that are required for a Tier 1 research institution. These resources are delivered by the TTU High Performance Computing Center, which was established in 1999 to promote research and teaching on campus through the integration of leading-edge HPC and visualization for faculty, staff and students.
Today, the university’s HPC Center remains true to its original mission as it provides hundreds of researchers with access to world-class HPC resources. These resources include two main clusters — Quanah, with a benchmarked total computing power of 485 teraflops, and Hrothgar, with a total aggregate computing power of 166 teraflops — in addition to several specialty clusters and special-purpose resources.
These two clusters, both from Dell EMC, are designed to provide a natural stepping stone between the capabilities available to researchers on their individual desktops, laptops and portable devices and those of highly competitive national and international supercomputing resources.
The newest of the clusters, Quanah, has 467 compute nodes with 36 cores each, for a total of 16,812 cores, of which 16,092 are reserved for general use and 720 are owned by specific research groups. Commissioned in early 2017 and expanded to its current size later that year, Quanah is based on Dell EMC PowerEdge C6320 servers. Each compute node consists of dual 18-core Intel® Xeon® Broadwell processors (36 cores per node) with 192 GB of memory.
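The node and core counts above are internally consistent, as a quick sketch confirms (all figures are taken directly from this article):

```python
# Sanity-check the published Quanah core counts.
NODES = 467            # PowerEdge C6320 compute nodes
CORES_PER_NODE = 36    # dual 18-core Intel Xeon Broadwell processors per node

total_cores = NODES * CORES_PER_NODE
general_use = 16_092   # cores reserved for general use
group_owned = 720      # cores owned by specific research groups

assert total_cores == 16_812                      # matches the stated total
assert general_use + group_owned == total_cores   # partition adds up
print(f"Quanah: {total_cores} cores "
      f"({general_use} general use, {group_owned} group-owned)")
```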
Hrothgar is an earlier Dell Linux cluster currently consisting of 630 total nodes and 8,246 total processing cores, of which 7,408 are made available for general use and the rest are owned by specific research groups. The Hrothgar cluster was initially built in 2011. Several updates have occurred since then, including the replacement of the core and leaf switches with QDR InfiniBand and the upgrade of approximately 100 of the nodes to newer Intel® Xeon® Ivy Bridge processors.
With the power of Quanah and Hrothgar, researchers at Texas Tech University are breaking new ground in a broad range of academic disciplines. As a Dell EMC case study notes, the clusters support applications in weather forecasting, climate science, chemistry and biology, as well as applications in many other domains. The system is open and accessible to faculty members, graduate students and collaborators from other institutions.
In an example of the diversity of the work that is under way at TTU, Tommy Dang, an assistant professor in the Department of Computer Science, is using the HPC resources for a wide range of research, from studies of virtual and augmented reality to big data visualization and visual analytics. With the help of the HPC clusters, Dang is developing methods and tools for visual analytics — an integrated approach that combines visualization, human factors and data analysis to derive insight from massive, dynamic and ambiguous data. This is the type of research that can be done only in a robust HPC environment.
A key takeaway here is that, through its investments in Quanah, Hrothgar and other HPC resources, TTU is making the power of HPC accessible to a broad community of researchers — and helping the university maintain its Carnegie “Tier 1” classification as a research-focused institution.
To learn more
For a closer look at TTU’s HPC environment, watch the Dell EMC video “Texas Tech University transforms testing and research capabilities.” And to learn more about the technologies that make it all happen, including Dell EMC Ready Solutions for HPC Research, explore Dell EMC solutions for HPC and AI.