by Rebecca Merrett

University of Adelaide switches on supercomputer to support researchers

Feb 15, 2016
Data Mining | Education Industry

The University of Adelaide has switched on Phoenix, a high performance computing (HPC) system that provides more than 4,300 of its researchers with the computing power they need to crunch large data sets.

The 300-teraflop system is based on Lenovo’s System x NeXtScale technology and has 15,360 gigabytes of memory and close to 4,000 cores. It supports researchers who require large amounts of computing power for machine learning to build autonomous vehicles, probability and predictive modelling for healthcare applications, and wave modelling for clean energy.
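Those figures allow a rough back-of-the-envelope check of the system's balance. The sketch below uses only the specs quoted above; the per-core result is approximate, since the article says only "close to 4,000 cores".

```python
# Back-of-the-envelope figures for Phoenix, using the specs quoted
# in the article: 15,360 GB of memory across roughly 4,000 cores.
total_memory_gb = 15_360
core_count = 4_000  # approximate; the article says "close to 4,000"

memory_per_core_gb = total_memory_gb / core_count
print(f"Memory per core: ~{memory_per_core_gb:.2f} GB")  # ~3.84 GB
```

A ratio of roughly 4 GB per core is a common configuration for general-purpose research HPC clusters.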

The system was up and running in early February, about six weeks after it was delivered.

“These systems don’t have a very long shelf life so you want to start using them as soon as possible,” said University of Adelaide CIO Mark Gregory. “Our team is about six or so people; two people from Lenovo helped out. We have energetic staff who put in a lot of hours.”

The supercomputer was partly crowdfunded, with about a dozen researchers from the university contributing funds to the project, Gregory said.

“They were really eager to get their coats on and start work,” he said.

The university’s more sophisticated users participate in national supercomputing projects but an in-house system made more sense for ‘mid-level’ users who have a lot of work to do but don’t have the relationships or time to go into a competitive grant for a supercomputer platform, said Gregory.

“They just want to do something really quickly,” he said.

Gregory said open source software was a key part of the HPC project.

“A lot of licensing models don’t work well when you get into high performance computing because they might be based on how many cores you have, or how fast your systems are, or other things. With high performance computing, those numbers are phenomenally high, so traditional licensing models don’t work very well,” he said.

“Also, most of the emerging stuff is being developed in the open source community. So the latest and greatest software around high performance computing is coming out of the open source world.

“There are downsides to open source as well; you trade off buying licenses with a lot more people and time in gluing things together. But that’s a well understood trade-off,” he said.

Gregory recently implemented an open source file storage system that stores nearly two petabytes of data used for research purposes.

“The file system that we are using comes from the open source community, as is the method that we use to partition the system and share it. So our allocation method is all open source, as well as the base software itself, which is effectively a variant of the Linux operating system,” he said.

Most of the scientific software tools that the university’s researchers use are also open source, he said.

“We depend quite a bit on open source software, we depend quite a bit on looking at other high performance computing environments and talking to peers [who have similar] architecture environments.”

Gregory said universities have a history of being at the forefront of large-scale technology deployments. Some were first to take up supercomputing, distributed computing and cloud technology, and were on the scene in the development of early computers in the 1940s and ‘50s. As a result, there are a lot of opportunities for private sector companies to learn from universities’ technology rollouts.

“Universities are also pretty open about how they build their environments, so companies can look to universities and they are likely to share quite a bit,” he said.

The University of Adelaide has more than 1,800 research staff and 2,500 research students in more than 50 research centres.