The University of Adelaide has switched on Phoenix, a high performance computing (HPC) system that gives more than 4,300 of its researchers the computing power they need to crunch large data sets.

The 300-teraflop HPC system is based on Lenovo's System x NeXtScale technology, and has 15,360 gigabytes of memory and close to 4,000 cores. It supports researchers who require large amounts of computing power for machine learning to build autonomous vehicles, probability and predictive modelling for healthcare applications, and wave modelling for clean energy.

The system was up and running in early February, about six weeks after it was delivered. "These systems don't have a very long shelf life so you want to start using them as soon as possible," said University of Adelaide CIO Mark Gregory. "Our team is about six or so people, and two people from Lenovo helped out. We have energetic staff who put in a lot of hours."

The supercomputer was partly crowdfunded, with about a dozen researchers from the university contributing funds to the project, Gregory said. "They were really eager to get their coats on and start work," he said.

The university's more sophisticated users participate in national supercomputing projects, but an in-house system made more sense for 'mid-level' users who have a lot of work to do but don't have the relationships or time to pursue a competitive grant for a supercomputer platform, Gregory said. "They just want to do something really quickly," he said.

Gregory said open source software was a key part of the HPC project. "A lot of licensing models don't work well when you get into high performance computing because they might be based on how many cores you have, or how fast your systems are, or other things. With high performance computing, those numbers are phenomenally high, so traditional licensing models don't work very well," he said.
"Also, most of the emerging stuff is being developed in the open source community. So the latest and greatest software around high performance computing is coming out of the open source world.

"There are downsides to open source as well; you trade off buying licenses with a lot more people and time spent gluing things together. But that's a well understood trade-off," he said.

Gregory recently implemented an open source file storage system that holds nearly two petabytes of data used for research purposes. "The file system that we are using comes from the open source community, as is the method that we use to partition the system and share it. So our allocation method is all open source, as well as the base software itself, which is effectively a variant of the Linux operating system," he said.

Most of the scientific software tools that the university's researchers use are also open source, he said. "We depend quite a bit on open source software, and we depend quite a bit on looking at other high performance computing environments and talking to peers [who have similar] architecture environments."

Gregory said universities have a history of being at the forefront of large-scale technology deployments. Some were first to take up supercomputing, to implement distributed computing and cloud technology, and were on the scene in the development of early computers in the 1940s and '50s. As a result, there are plenty of opportunities for private sector companies to learn from their technology rollouts. "Universities are also pretty open about how they build their environments, so companies can look to universities and they are likely to share quite a bit," he said.

The University of Adelaide has more than 1,800 research staff and 2,500 research students in more than 50 research centres.