With millions of data points and large multidimensional datasets, 2D visuals on a single screen only go so far. To help researchers and data scientists intuitively explore huge and complex data in great detail, a couple of software developers from UTS built a 3D, 360-degree data visualisation room.
This room, called the Data Arena, officially opened yesterday and is housed in the revamped Faculty of Engineering and IT, with six projectors placed around curved walls that form a round drum shape.
Read: CIO Christine Burns talked about the Data Arena as part of the university’s $1 billion buildings upgrade when it was in its early stages
One researcher who relies heavily on data visualisations is Dr Jaime Valls Miro, associate professor at the university’s Centre for Autonomous Systems. He is assessing the condition of water pipes to predict when they are likely to break and cause costly damage. Sydney Water is the main stakeholder backing the research project.
“It’s very difficult to be able to appreciate how a hard crack would develop, for example. In the space you can actually see, much larger than real size, how that is happening. It gives us more of an understanding of what is happening with real assets, because if you look at just the raw data you cannot interpret that in the same kind of way; our brain doesn’t work that way.
“I think it is one step up in the sense that our brain doesn’t work in 2D, it works in 3D. It allows us to relate to what our brain understands better. I was able to better appreciate the space between the inner wall and the outer wall and see calibration effects there, which is not easy to visualise in a 2D world,” he says.
Using a high-definition laser scanner, the team collects 32 million data points on a single water pipe alone.
The data is fed into Houdini software – used for visual effects in feature films – which uses geometry to turn the numbers into pictures, with GPU power generating the visuals in real time.
Valls Miro can go into fine detail to see how cracks and holes in the pipes have formed and examine the thickness or wearing down of the pipe. That assessment is then combined with other environmental factors that cause corrosion – such as how the material of the underground pipe is affected by the surrounding soil, and the daily water load and pressure – to predict when it is likely to fail.
“These millions of data points that we are talking about, it is just not feasible to sit at a computer screen to see to the detail that you require. Visualising it in this way allows us to get a much better understanding,” he says.
Using a hand-held controller, Valls Miro can rotate, turn, and zoom in and out as much as he likes when assessing a pipe. Behind the 360-degree screen, the data and its geometric form are constantly being computed.
“All these millions of points are being re-computed in real time. So when you rotate a little bit to the left, all the datasets are rotating, and are all integrated with the same pipeline in 3D and in 360 degrees.”
Ben Simons, one of the developers of the Data Arena, said using the Houdini software is what makes the Data Arena unique. Drawing on his experience working on the computer-animated feature film Happy Feet Two, he saw an opportunity to bridge the geometric capability of the software with high-performance computing to handle large datasets.
“What we are doing is we are taking that silo of capability in the visual effects industry and the high performance computing, real time GPU, and we are building a bridge between the two. 
And that bridge is unique in our Data Arena; no one else is doing that,” he said.
The Data Arena is supported by nine NVIDIA Quadro K6000 GPUs, with a combined 27,000 CUDA parallel processing cores – a huge amount of power.
The open source software Equalizer is also used for parallel rendering and workload balancing, so jobs can be spread across the GPUs.
“Its heritage comes from Silicon Graphics Inc. (SGI), which used to be the leader in computer graphics. And all feature films used to be on SGI,” he added of the software.
Simons also used field programmable gate arrays (FPGAs) to ensure images are aligned to a grid and don’t warp or distort. In addition, ‘edge blending’ was applied between each pair of projectors so the visualisations appear to spread seamlessly across them all.
This proved challenging, however, Simons said.
“You start with projector one and you fix the blending for projector two, and then three, four, five, six, and once you do six you are back at one. So you’re like ‘oh no’, because now you have to move one, which affects two.”
Simons is developing a desktop version of the Data Arena so that researchers can do quick visual mock-ups in their offices or homes before plugging a USB stick into the Arena for further analysis in the 360-degree lab.
“You will be able to run a Data Arena virtual machine on a Mac, PC and Linux machine, and maybe even on Android. It’s not going to draw as well – it’s not like you have the GPUs that we have here – but you can do a mock-up so you can get confidence that this is the bit of my data I want to see, this is the filtering I want to try.”
This will be released as open source over the coming months on the Data Arena website, which is under development.
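The edge blending Simons describes earlier can be sketched in a few lines. This is an illustrative simplification only: it uses a linear ramp in each overlap region so that the weights of two adjacent projectors sum to one, whereas the Data Arena's actual calibration runs on FPGAs and real systems typically apply gamma-corrected, measured blend curves. The function name and the 10 per cent overlap figure are assumptions, not details from the article.

```python
def blend_weight(x, overlap):
    """Brightness weight across one projector's image, x in [0, 1].

    Ramps up over the left overlap region, holds at 1.0 in the middle,
    and ramps down over the right overlap region, so that where two
    adjacent projectors show the same content their weights sum to 1.
    (A linear ramp is a simplification; real blends are gamma-corrected.)
    """
    if x < overlap:
        return x / overlap          # fade in from the left neighbour
    if x > 1.0 - overlap:
        return (1.0 - x) / overlap  # fade out into the right neighbour
    return 1.0

# In an overlap region, the right edge of projector N and the left edge
# of projector N+1 display the same pixel; their weights sum to 1.
overlap = 0.1
x_right = 0.95                       # near projector N's right edge
x_left = x_right - (1.0 - overlap)   # same spot in projector N+1's image
total = blend_weight(x_right, overlap) + blend_weight(x_left, overlap)
print(round(total, 6))  # 1.0
```

The circular dependency Simons mentions – fixing projector six shifts projector one – arises because the ring closes on itself, so the overlaps cannot be calibrated strictly left to right.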
Simons said open sourcing this technology to the community will only help speed progress in research, which the university supports.
“You want technological adoption, you want everyone using that. It’s like a curve called the Metcalfe curve [after Robert Metcalfe, who invented Ethernet], and the idea is I don’t want a phone unless everyone has a phone, as there’s no point in only me having one unless I can call you.
“So by releasing software for free it allows us to get up into that technological adoption curve. It’s to our benefit if everyone is using this software,” Simons said.
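The adoption curve Simons invokes is usually formalised as Metcalfe's law: a network's value grows roughly with the number of possible pairwise connections, n(n-1)/2, so it rises quadratically as more people adopt. A toy illustration (the numbers are illustrative, not from the article):

```python
def metcalfe_value(n):
    """Number of distinct pairwise connections among n users: n*(n-1)/2."""
    return n * (n - 1) // 2

# One phone is worthless ("no point in only me having one"),
# but value climbs quadratically with each new adopter.
for users in (1, 2, 10, 100):
    print(users, metcalfe_value(users))
# 1 user -> 0 connections; 100 users -> 4950 connections.
```

This is why giving the Data Arena software away can still benefit UTS: each new user increases the value of the platform for every existing user.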