
LBNL Team Takes Software to the End of the Earth

April 4, 2005


A detector containing software developed by a Berkeley Lab-led team is lowered into the Antarctic ice. Photo by Jim Haugen, IceCube/NSF.

Developing robust, reliable data acquisition software for high-energy physics experiments is always a challenge, but developing such software for an experiment expected to run for up to 15 years while buried in the Antarctic ice poses unique problems.

But a team led by Chuck McParland of CRD’s Distributed Scientific Tools Group has risen to the occasion. The first kilometer-long string of 60 detectors recently buried near the South Pole is already recording light pulses as the experiment seeks out evidence of neutrinos.

The experiment, an international undertaking by 26 institutes, is known as IceCube because its strings of detectors will cover a cubic kilometer of ice. The detectors, which will number 4,800 once installation is completed in 4-5 years, will serve as a telescope designed to study the high-energy variety of the ghostlike subatomic particles known as neutrinos. Originating from the Milky Way and beyond and traveling to Earth virtually unobstructed, these high-energy neutrinos serve as windows back through time and should provide new insight into questions about the nature of dark matter, the origin of cosmic rays, and other cosmic mysteries.

Berkeley Lab researchers were responsible for the unique electronics package inside the digital optical modules (DOMs) that will enable IceCube to pick out the rare signal of a high-energy neutrino colliding with a molecule of water. A DOM is a pressurized glass sphere the size of a basketball that houses an optical sensor, called a photomultiplier tube, which can detect photons and convert them into electronic signals that scientists can analyze.

Equipped with onboard control, processing and communications hardware and software, and connected in long strings of 60 each via an electrical cable, the DOMs can detect neutrinos with energies ranging from 200 billion to one quadrillion or more electron volts. In January and February, the first IceCube cable, with its 60 DOMs, was lowered into a hole drilled through the Antarctic ice using jets of hot water. Plans call for a total of 80 strings of DOMs to be put in place over the next five years. The Antarctic summer season, during which the weather is “mild” enough for work on IceCube to proceed, lasts only from mid-October to mid-February. After that, winter sets in and the climate is much too harsh for any outdoor work.

While the operating environment offers a unique challenge, McParland said, “The biggest challenge is bridging the gap between the hardware side of the project and the software side.” The hardware developers needed some of the data acquisition software to test their systems, but that was just one aspect of the overall software project. “Getting the testing and the actual data acquisition components to all work together is a challenging job,” he said.

Adding to the difficulty was the push to engineer software that would support the 10- to 15-year lifespan of the experiment. “This means we had to put a lot more effort into the design of the software and use more modern programming practices than are typically used for other experiments of this sort,” McParland said. Most of the acquisition software is being designed by a team from LBNL that includes Chris Day, Simon Patton, Akbar Mokhtarani, Artur Muratus, Keith Beattie, Martin Stoufer, Arthur Jones, David Hayes and John Jacobsen, working with collaborators at the University of Wisconsin-Madison, Penn State University and the University of Delaware.

The software is being developed in two pieces. The first piece goes “down the hole,” loaded on each of the DOMs. The system is designed so that the software in each module can be updated and, thanks to programmable logic, the circuit boards can also be reprogrammed to change the behavior of the detectors.

“Since the first string was installed we are already seeing events – cones of light that lit up all of the modules,” McParland said.

At the top of each detector string will be a PC, where the second piece of software will be installed. Together, the PCs will form a pyramid of processors, with each performing initial filtering of the data before sending them to a satellite for transmission to what will eventually be a 120-processor farm.
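As a rough illustration of that filter-and-forward idea, the sketch below shows what a surface processor's first pass over incoming hits could look like. It is not IceCube's actual code; the Hit record, the charge threshold and the uplink object are all hypothetical stand-ins.

    # Hypothetical sketch of a surface PC's filter-and-forward stage.
    # Hit records arrive from the string below; only those passing a
    # simple selection are sent on toward the satellite uplink.
    from dataclasses import dataclass

    @dataclass
    class Hit:
        dom_id: int     # which optical module fired
        time_ns: float  # timestamp in nanoseconds
        charge: float   # pulse size, in arbitrary units

    def first_level_filter(hits, min_charge=0.25):
        """Drop obviously uninteresting hits before forwarding."""
        return [h for h in hits if h.charge >= min_charge]

    def forward(hits, uplink):
        for hit in first_level_filter(hits):
            uplink.send(hit)  # uplink is assumed to wrap the satellite queue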

The data flow from the detectors in the form of wave forms describing each event. Each wave form is time-stamped, which allows them to be compared, with accuracy to within five nanoseconds, so that researchers can see how and when a particle passed near each detector. Calibrating the system to allow this comparison required the team to develop a new protocol to stop all communication between the modules – a task beyond the capability of off-the-shelf protocols, McParland said.
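To give a sense of what that kind of time-based comparison involves, here is a simplified sketch, not the team's protocol, that groups calibrated, time-stamped hits lying close together. It reuses the hypothetical Hit record from the sketch above, and the tolerance value is only illustrative.

    # Simplified sketch: cluster time-stamped hits that lie close together,
    # assuming every timestamp has already been calibrated to a common clock.
    def cluster_hits(hits, tolerance_ns=5.0):
        """Group hits whose timestamps fall within tolerance_ns of the previous hit."""
        clusters, current = [], []
        for hit in sorted(hits, key=lambda h: h.time_ns):
            if current and hit.time_ns - current[-1].time_ns > tolerance_ns:
                clusters.append(current)
                current = []
            current.append(hit)
        if current:
            clusters.append(current)
        return clusters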

Once the data have been collected and an interesting event found, scientists will actually trigger the experiment after the fact, McParland said. This is achieved by breaking apart the collected data and going back through them to find the wave forms depicting the event.
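A toy version of that after-the-fact trigger might look like the sketch below: it scans a buffer of time-ordered hits for stretches in which many different modules fired within a short window, then pulls out those slices of data. The window length and module count are invented for illustration.

    # Toy "software trigger": flag stretches of the hit buffer in which
    # at least min_doms distinct modules fired within window_ns of each other.
    def find_events(hits, window_ns=5000.0, min_doms=8):
        hits = sorted(hits, key=lambda h: h.time_ns)
        flagged = [False] * len(hits)
        start = 0
        for end in range(len(hits)):
            while hits[end].time_ns - hits[start].time_ns > window_ns:
                start += 1
            if len({h.dom_id for h in hits[start:end + 1]}) >= min_doms:
                for i in range(start, end + 1):
                    flagged[i] = True
        # Merge contiguous flagged hits into candidate events.
        events, current = [], []
        for hit, keep in zip(hits, flagged):
            if keep:
                current.append(hit)
            elif current:
                events.append(current)
                current = []
        if current:
            events.append(current)
        return events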

Another problem faced by the developers was dealing with impurities in the glass used in the spheres and photomultiplier tubes. As the impurities decay, they give off light pulses, which meant that the software had to filter out 50 to 100 times more incidental signals than actual scientific data. “There’s just lots of background stuff you have to get rid of,” McParland said.
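One generic way to suppress that kind of isolated noise, shown here only as a sketch and not as the team's actual algorithm, is a local-coincidence cut: a hit is kept only if a neighboring module on the same string also fired within a short time window. The parameter values are invented for illustration.

    # Generic local-coincidence sketch: keep a hit only if a nearby module
    # also fired within window_ns, so isolated noise pulses (such as those
    # from impurities in the glass) are discarded.
    def local_coincidence(hits, neighbor_span=1, window_ns=1000.0):
        kept = []
        for i, hit in enumerate(hits):
            for j, other in enumerate(hits):
                if i == j:
                    continue
                close_in_time = abs(hit.time_ns - other.time_ns) <= window_ns
                is_neighbor = abs(hit.dom_id - other.dom_id) <= neighbor_span
                if close_in_time and is_neighbor:
                    kept.append(hit)
                    break
        return kept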

IceCube will field about 20 times the detector power of its predecessor, another South Pole high-energy neutrino telescope called AMANDA (Antarctic Muon And Neutrino Detector Array). McParland was also part of the AMANDA project, and spent five weeks at the South Pole as part of that effort.

While his group’s role in IceCube will begin to wind down in 2006, for the time being the work is both very demanding and proceeding very well, McParland said.

“Overall it’s a very interesting computer science problem in that we have a distributed group of people working to put together a good software design with high quality code,” he noted.


About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery. Researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab’s Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.