CERN's Search for God (Particles) Drives Massive Storage Needs

Think your storage headaches are big? Try being the guy in charge of storing 1GB of data per second, day and night for a month, coming off CERN's Large Hadron Collider (LHC).

By Laurianne McLaughlin
Fri, July 20, 2007

CIO

Maybe you last read about CERN (the European Organization for Nuclear Research) and its massive particle accelerators in Angels & Demons by Dan Brown of The Da Vinci Code fame. In that book, the lead character travels to the cavernous research institute on the border of France and Switzerland to help investigate a murder. In real life, one of CERN's grisliest problems is finding storage for the massive amounts of data derived from its four high-profile physics experiments making use of the institute's Large Hadron Collider (LHC). Due for operation in May 2008, the LHC is a 27-kilometer-long device designed to accelerate subatomic particles to nearly the speed of light, smash them into each other and then record the results.

The LHC experiments will study everything from the tiniest forms of matter to the questions surrounding the Big Bang. The latter subject provided Pierre Vande Vyvre, a project leader for data acquisition for CERN, with a particularly thorny challenge: He had to design a storage system for one of the four experiments, ALICE (A Large Ion Collider Experiment). It's one of the biggest physics experiments of our time, boasting a team of more than 1,000 scientists from around the world.

For one month per year, the LHC will be spitting out project data to the ALICE team at a rate of 1GB per second. That's 1GB per second, for a full month, "day and night," Vande Vyvre says. During that month, the data rate is a full order of magnitude higher than that of any of the other three experiments being done with the LHC. In total, the four experiments will generate petabytes of data.
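To put that rate in perspective, here is a back-of-the-envelope calculation; the 30-day run length and decimal units (1 PB = one million GB) are assumptions for illustration, not figures from CERN:

```python
# Rough estimate of ALICE's one-month data volume at a sustained 1GB/s.
# Assumptions (not from the article): a 30-day run, decimal units
# (1 GB = 10**9 bytes, 1 PB = 10**15 bytes).

RATE_GB_PER_S = 1                  # sustained rate quoted by Vande Vyvre
SECONDS_PER_DAY = 24 * 60 * 60
RUN_DAYS = 30                      # assumed length of the one-month run

total_gb = RATE_GB_PER_S * SECONDS_PER_DAY * RUN_DAYS
total_pb = total_gb / 1_000_000    # one million GB per PB

print(f"{total_gb:,} GB = about {total_pb:.1f} PB")
# 2,592,000 GB = about 2.6 PB
```

That single month from ALICE alone works out to roughly 2.6 petabytes, consistent with the article's claim that the four experiments together produce petabytes of data.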

CERN believes that the LHC will let scientists re-create how the universe behaved immediately after the Big Bang. At that time, everything was a "sort of hot dense soup...composed of elementary particles," the project's webpage explains. The LHC can trigger "little bangs" that let ALICE scientists study how the particles act and come together, helping answer questions about the actual structure of atoms.

"The data is what the whole experiment is producing," Vande Vyvre says. "This is the most precious thing we have.”

Vande Vyvre is charged with managing the PCs, storage equipment, and custom and homegrown software surrounding the ALICE project's data before it hits the data center and gets archived. The ALICE group's experiments will start running in May 2008, but the storage rollout began in September 2006.

The ALICE experiment grabs its data from 500 optical fiber links and feeds data about the collisions to 200 PCs, which start to piece the many snippets of data together into a more coherent picture. Next, the data travels to another 50 PCs that do more work putting the picture together and then record the data to disk near the experiment site, which is about 10 miles away from the data center. "During this one month, we need a huge disk buffer," Vande Vyvre says.
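For readers who think in code, here is a minimal sketch of that staged, fan-in flow. The link and PC counts come from the article; everything else (function names, the byte-joining logic, the toy data) is hypothetical and stands in for CERN's actual data-acquisition software:

```python
# Hypothetical sketch of ALICE's staged data flow: fragments from many
# fiber links fan in to a first tier of PCs, then a second tier finishes
# event assembly and writes to a disk buffer. Illustrative only.

from typing import List

FIBER_LINKS = 500   # optical links out of the detector (per the article)
STAGE1_PCS = 200    # PCs that begin piecing fragments together
STAGE2_PCS = 50     # PCs that finish assembly and record to disk

def assemble_fragments(fragments: List[bytes]) -> bytes:
    """Stage 1: merge snippets from many fiber links into a sub-event."""
    return b"".join(fragments)

def build_and_record(sub_events: List[bytes], disk_buffer: List[bytes]) -> None:
    """Stage 2: build the full event and append it to the disk buffer."""
    disk_buffer.append(b"".join(sub_events))

# Toy run: one batch of fragments flows through both stages.
disk_buffer: List[bytes] = []
fragments = [b"frag" for _ in range(FIBER_LINKS)]
sub_event = assemble_fragments(fragments)
build_and_record([sub_event], disk_buffer)
print(f"{len(disk_buffer)} event(s) buffered to disk")
```

The two-tier layout mirrors the article's description: the first tier absorbs the raw firehose from the detector, and the smaller second tier handles final assembly and the write to the disk buffer near the experiment site.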
