With millions of data points and large multidimensional datasets, 2D visuals on a single screen only go so far. To help researchers and data scientists intuitively explore huge and complex data in great detail, a couple of software developers from UTS built a 3D, 360-degree data visualisation room.
This room, called the Data Arena, officially opened yesterday and is housed in the revamped Faculty of Engineering and IT, with six projectors placed around curved walls that form a round drum shape.
One researcher who relies heavily on data visualisations is Dr Jaime Valls Miro, associate professor at the university’s Centre for Autonomous Systems. He is assessing the condition of water pipes to predict when they are likely to break and cause costly damage. Sydney Water is the main stakeholder backing the research project.
“It’s very difficult to be able to appreciate how a hard crack would develop, for example. In the space you can actually see much larger than real size how that is happening. It gives us more of an understanding of what is happening with real assets, because if you look at just the raw data you cannot interpret that in the same kind of way; our brain doesn’t work that way.
“I think it is one step up in the sense our brain doesn’t work in 2D, it works in 3D. It allows us to relate to what our brain understands better. I was able to appreciate better the space between the inner wall and the outer wall and see calibration effects there, which is not easy to visualise in a 2D world,” he says.
A high definition laser scanner collects 32 million data points from a single water pipe. The data is fed into Houdini software – used for visual effects in feature films – which uses geometry to turn numbers into pictures, with GPUs rendering the visuals in real time.
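The kind of mapping described here, from raw scan measurements to renderable geometry, can be sketched in a few lines. This is a hypothetical illustration, not the actual pipeline: the data layout (angle around the pipe, axial position, wall thickness) and the function names are assumptions.

```python
# Hypothetical sketch: mapping laser-scan measurements of a pipe wall
# to 3D geometry, in the spirit of a point-cloud-to-geometry pipeline.
# The (angle, z, thickness) layout is an assumption, not the actual
# Sydney Water dataset format.
import numpy as np

def scan_to_geometry(angles, z_positions, thicknesses, radius=0.5):
    """Turn (angle, z, thickness) samples into XYZ vertices plus a
    per-vertex channel encoding wall thickness for colour mapping."""
    x = radius * np.cos(angles)
    y = radius * np.sin(angles)
    vertices = np.column_stack([x, y, z_positions])
    # Normalise thickness to [0, 1] so thin (worn) spots stand out.
    t = np.asarray(thicknesses, dtype=float)
    span = t.max() - t.min()
    colours = (t - t.min()) / (span if span else 1.0)
    return vertices, colours

# Toy data: 8 samples around one ring of the pipe.
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
verts, cols = scan_to_geometry(angles, np.zeros(8), np.linspace(5, 9, 8))
print(verts.shape)  # (8, 3)
```

In practice each scan contributes millions of such samples, and the colour channel is what lets a thin or cracked region jump out visually at wall scale.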
Valls Miro can go into the fine details to see how cracks and holes in the pipes have formed and examine the thickness or wearing down of the pipe. That assessment is then combined with other environmental factors that cause corrosion such as how the material of the underground pipe is affected by surrounding soil and the daily water load and pressure to predict when it is likely to fail.
“These millions of data points that we are talking about, it is just not feasible to sit at a computer screen to see to the detail that you require. Visualising it in this way allows us to get a much better understanding,” he says.
Using a hand-held controller, Valls Miro can rotate, turn, and zoom in and out as much as he likes when assessing a pipe. Behind the 360-degree screen, the data and its geometric form are constantly being recomputed.
“All these millions of points are being re-computed in real time. So when you rotate a little bit to the left, all the datasets are rotating, and are all integrated with the same pipeline in 3D and in 360 degrees.”
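The transform step Simons describes, re-computing every point as the viewer rotates, is a single vectorised matrix multiply. A minimal CPU sketch of the same maths (the Arena does this on the GPU):

```python
# Minimal sketch of the real-time transform step: rotating an entire
# point cloud with one vectorised matrix multiply. In the Data Arena
# this runs on GPUs; NumPy on the CPU shows the same mathematics.
import numpy as np

def rotate_points(points, angle_rad):
    """Rotate an (N, 3) point cloud around the z axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    return points @ rot.T

points = np.random.default_rng(0).normal(size=(1_000_000, 3))
rotated = rotate_points(points, np.pi / 6)  # nudge 30 degrees to the left
# A rotation preserves every point's distance from the origin.
assert np.allclose(np.linalg.norm(points, axis=1),
                   np.linalg.norm(rotated, axis=1))
```

Because the whole dataset goes through one transform, "rotating a little bit to the left" really does move all 32 million points in lockstep.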
Ben Simons, one of the developers of the Data Arena, said the use of Houdini is what makes the Data Arena unique. Drawing on his experience working on the computer-animated feature film Happy Feet Two, he saw an opportunity to bridge the geometric capability of the software with high performance computing to handle large datasets.
“What we are doing is we are taking that silo of capability in the visual effects industry and the high performance computing, real time GPU and we are building a bridge between the two. And that bridge is unique in our Data Arena, no one else is doing that,” he said.
The Data Arena is driven by nine NVIDIA Quadro K6000 GPUs, with 27,000 CUDA parallel processing cores between them.
The open source Equalizer framework is also used for parallel rendering and load balancing, spreading rendering jobs across the GPUs.
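The core idea behind that kind of load balancing is to decompose each frame into tiles and hand them out across GPUs. Equalizer itself is a C++ framework; this toy Python scheduler only illustrates the decomposition, and every name in it is hypothetical.

```python
# Hedged sketch of 2D frame decomposition for multi-GPU rendering,
# in the spirit of Equalizer's tile compounds. The real framework is
# C++ and far more sophisticated (dynamic load balancing, compositing).
def split_frame(width, height, tile, n_gpus):
    """Assign each tile of a width x height frame to a GPU, round robin.

    Returns {gpu_id: [(x, y, w, h), ...]}.
    """
    tiles = [(x, y, min(tile, width - x), min(tile, height - y))
             for y in range(0, height, tile)
             for x in range(0, width, tile)]
    assignment = {g: [] for g in range(n_gpus)}
    for i, t in enumerate(tiles):
        assignment[i % n_gpus].append(t)
    return assignment

jobs = split_frame(1920, 1080, tile=256, n_gpus=9)
total_tiles = sum(len(v) for v in jobs.values())
print(total_tiles)  # 40 tiles, spread across 9 GPUs
```

Each GPU renders only its tiles, and a compositing step stitches the results back into one frame, which is how nine cards can cooperate on a single 360-degree image.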
“Its heritage comes from Silicon Graphics Inc. (SGI), which used to be the leader in computer graphics. And all feature films used to be on SGI,” he said of the software.
Simons also used field programmable gate arrays (FPGAs) to ensure images stay aligned to a grid and do not warp or distort. In addition, ‘edge blending’ was applied between adjacent projectors so the visualisations appear to spread seamlessly across all six.
This has proven to be challenging, however, Simons said.
“You start with projector one and you fix the blending for projector two and then three, four, five six, and once you do six you are back at one. So you’re like ‘oh no’ because now you have to move one which affects two.”
Simons is also developing a desktop version of the Data Arena, so that researchers can do quick visual mock-ups in their offices or homes before plugging a USB stick into the Arena for further analysis in the 360-degree lab.
“You will be able to run a Data Arena virtual machine on a Mac, PC and Linux machine and maybe even on Android. It’s not going to draw as well, it’s not like you have the GPUs that we have here. But you can do a mock up so you can get confidence that this is the bit of my data I want to see, this is the filtering I want to try.”
This will be released as open source over the coming months on the Data Arena website, which is still under development. Simons said open sourcing the technology will only help speed research progress, which the university supports.
“You want technological adoption, you want everyone using that. It’s like a curve called the Metcalfe curve [Robert Metcalfe, who invented Ethernet], and the idea is I don’t want a phone unless everyone has a phone, as there’s no point in only me having one unless I can call you.
“So by releasing software for free it allows us to get up into that technological adoption curve. It’s to our benefit if everyone is using this software,” Simons said.
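The curve Simons refers to is usually stated as Metcalfe's law: a network's value grows with the number of possible connections between its users, n(n−1)/2, i.e. roughly n². A two-line sketch of his phone example:

```python
# Metcalfe's law, sketched: the value of a network scales with the
# number of distinct pairs of users, n * (n - 1) / 2 ~ n^2.
def connections(n):
    """Number of distinct pairs in a network of n users."""
    return n * (n - 1) // 2

print(connections(2))    # 1: a phone is only useful once someone else has one
print(connections(100))  # 4950
```

Doubling the user base roughly quadruples the possible connections, which is why free distribution is a rational strategy for pushing adoption up that curve.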