In March last year, scientists at the Queensland Brain Institute (QBI) discovered a non-invasive ultrasound technology that can be used to treat Alzheimer’s disease. It is a drug-free approach that breaks apart the ‘neurotoxic amyloid plaques’ that result in memory loss and cognitive decline.
Scientists making these medical breakthroughs at the QBI – one of the largest neuroscience organisations in the world – rely heavily on tech infrastructure to access, store and manage large data sets from brain imaging and microscopy devices.
“We are constantly trying to keep ahead of the curve – upgrading infrastructure so we are in front rather than people complaining about the compute or storage platform being slow. We want zero friction,” Jake Carroll, senior information technology manager (research) at QBI, told CIO Australia.
“But we find that with every increase in research capability that we put into the organisation, a new instrument or high throughput technology or a new modality in scientific analysis pops up and finds a way to confound every bit of bandwidth that we have provisioned.”
Scientific computing has ‘scary frontiers’, and Carroll said he’s amazed every day that users can “break a 100Gb/s pipe or flatten every CPU that’s thrown at them.”
QBI has become the first Australian organisation to roll out Brocade G620 fibre channel switches to form a fully redundant storage network fabric with 32Gb/s links – replacing older models that provided only 8Gb/s data throughput. The new links can be combined into a 128Gb/s frame-based trunk to deal with the organisation’s demanding data flows.
The new fibre channel fabric is part of an integrated storage solution alongside HDS high performance ‘all flash’ arrays and the Oracle Hierarchical Storage Manager to control data flow between storage layers.
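The hierarchical storage manager's role in this design is essentially policy-driven data movement: keeping hot data on the all-flash tier and migrating cold data down to tape. A minimal sketch of that idea (the thresholds and function names here are hypothetical illustrations, not Oracle Hierarchical Storage Manager's actual API or QBI's policy):

```python
from datetime import datetime, timedelta

# Hypothetical tiering thresholds -- illustrative only.
FLASH_MAX_AGE = timedelta(days=7)    # hot data stays on the all-flash tier
DISK_MAX_AGE = timedelta(days=90)    # warm data sits on spinning disk

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Pick a storage tier from how recently a file was accessed."""
    age = now - last_access
    if age <= FLASH_MAX_AGE:
        return "flash"
    if age <= DISK_MAX_AGE:
        return "disk"
    return "tape"        # cold data migrates to the tape tier

now = datetime(2017, 1, 1)
print(choose_tier(datetime(2016, 12, 30), now))  # recently touched file
print(choose_tier(datetime(2016, 6, 1), now))    # months-idle file
```

In a real HSM deployment the policy engine would also factor in file size, project quotas and staging rules for recall, but the core mechanism is this kind of age-based classification running continuously over the namespace.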
The storage infrastructure will eliminate performance barriers and provide scientists with seamless access to the data they need to carry out research. It also provides appropriate governance and preservation layers – relating to reproducibility and immutability.
“In this particular case, I was replacing a large disk array which was the core of a high performance computing home directory structure, which is a store for genomic data among other things,” said Carroll.
QBI has a whopping 8.5 petabytes (PB) of unstructured machine data that comes out of scientific research instruments and “hasn’t been put into a database or platform where it is curated yet,” said Carroll.
“This is just raw data, it doesn’t include administrative data or anything to that effect,” he said.
QBI has around 10PB of data stored on tape, a storage technology that Carroll said is not going away any time soon.
“We really needed to up the ante on our fibre channel infrastructure so we could cope, noting that the ‘all flash array’ market – geared to high performance and low latency – was really starting to come of age,” he said.
“This is the right move at the right time because it gives us a few years of runway now for extremely high I/O scenarios for both extremely low latency media and extremely fast tape serial I/O.”
Carroll added that there is a cohort in the industry “banging on with the mantra that fibre channel and tape is dead.”
“I’ve got frank and carefully placed news for the individuals who believe that. In scientific computing or parts of the market where availability, latency and quality of service matter, it’s just not true.”
“We really do care about our quality of service and our frames arriving when they should arrive and with the latency that we expect. But further, we still can’t see from a TCO perspective that tape is dead. We just can’t see it in our world.
“No matter what we see and the rhetoric we are hearing from the disk vendors of the world, tape still has a big place in research computing.”