
Computational Cosmology Center (C3) Launched

October 27, 2007


C3 founders Julian Borrill (left) and Peter Nugent (right) celebrate the launch of the research center with Horst Simon (middle), Associate Lab Director for Computing Sciences at Berkeley Lab.

Computational science and physics researchers gathered this month to celebrate the launch of the Computational Cosmology Center (C3), which further cements a decade-long collaboration between two Berkeley Lab divisions for studying dark energy and other mysteries of the universe.

The two leaders of the center, Julian Borrill and Peter Nugent, jointly presented their groups’ research to both the Physics and Computational Research divisions in October. Overall, the center currently comprises six researchers from the two divisions and UC Berkeley.

“Bringing a team of computational scientists from different divisions together in a single office area anticipates what we are planning to do with the new Computational Research and Theory building. The formation of the center will increase productivity and create new scientific opportunities,” said Horst Simon, Associate Lab Director for Computing Sciences at Berkeley Lab.

The creation of C3 comes at a time when researchers around the world are gearing up to analyze data from the European Space Agency’s Planck satellite, due to launch in the summer of 2008, and to begin work on the Joint Dark Energy Mission (JDEM), recently endorsed by the National Academy’s Beyond Einstein Program as the top-priority next-generation space mission for NASA and DOE.

Planck, a joint ESA/NASA mission, will provide the most detailed observations to date of the Cosmic Microwave Background (CMB), the remnant radiation from the Big Bang that fills the universe. Studying the tiny fluctuations in the CMB temperature and polarization will enable researchers to determine the fundamental properties of the universe with unprecedented precision.

The first detection of these fluctuations by the COBE satellite in 1992 earned Berkeley Lab’s George Smoot a share of the 2006 Nobel Prize in Physics. Smoot was also the principal investigator of the Laboratory Directed Research and Development (LDRD) project that brought Borrill to NERSC in 1997 specifically to develop the high performance computing tools needed to analyze CMB data sets like Planck’s.

The study of Type Ia supernovae marks the other key area of research by the C3 group. Researchers’ ability to calibrate the peak brightness of these supernovae to within a few percent provides an excellent yardstick for measuring distances across the universe. Type Ia supernovae are fundamental to measuring the expansion history of the universe and figure in all of the proposed JDEM missions to date. It was by poring over data on faraway Type Ia supernovae that researchers first found proof that the expansion of the universe is accelerating — not slowing down, as many experts had believed.
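The "standard candle" arithmetic behind that yardstick can be sketched in a few lines of Python. This is an illustrative sketch, not C3 or Berkeley Lab code; the peak absolute magnitude of about −19.3 is an assumed textbook value, and real analyses apply light-curve corrections before this step.

```python
# Illustrative sketch (not C3/Berkeley Lab code).
# Type Ia supernovae act as "standard candles": their calibrated peak
# absolute magnitude M is nearly uniform (roughly -19.3, an assumed
# textbook value), so an observed peak apparent magnitude m gives a
# distance via the distance modulus  m - M = 5 * log10(d_parsecs / 10).

PEAK_ABSOLUTE_MAG = -19.3  # assumed typical Type Ia peak brightness

def luminosity_distance_mpc(apparent_mag: float) -> float:
    """Convert an observed peak apparent magnitude to distance in megaparsecs."""
    distance_modulus = apparent_mag - PEAK_ABSOLUTE_MAG
    d_parsecs = 10 ** (distance_modulus / 5 + 1)  # invert the distance modulus
    return d_parsecs / 1e6  # parsecs -> megaparsecs

# A supernova peaking at apparent magnitude 24 turns out to lie several
# billion parsecs away, deep in the expansion history of the universe.
print(f"{luminosity_distance_mpc(24.0):.0f} Mpc")
```

Comparing such inferred distances against each supernova's redshift is what revealed the acceleration: the distant supernovae were dimmer, and hence farther away, than a decelerating universe would allow.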

Nugent was on one of the two international teams, led by Saul Perlmutter of Berkeley Lab, that announced the discovery independently in 1998. Nugent is also a member of the SciDAC Computational Astrophysics Consortium, which is funded by the DOE Office of Science to develop scientific computing software for large-scale studies of supernovae, gamma-ray bursts and nucleosynthesis.

“High performance computing will be critical for the simulations and data analysis that will be needed to understand dark energy,” said Borrill. Using the new capabilities afforded by NERSC’s recently acquired Franklin supercomputer, Borrill and colleagues have performed the first simulation and analysis of a year of data from all of Planck’s detectors, an effort to test different codes and ready them for analyzing the real data later.

Nugent has been working on the Nearby Supernova Factory data analysis as part of his SciDAC collaboration. As a byproduct of this work, he has assembled all of the historical imaging taken at the Palomar Oschin Schmidt telescope over the past seven years that has been used to hunt for supernovae. The entire 60-terabyte dataset forms both a temporal and a static catalog of astrophysical objects.

“I was in the unique position of utilizing my expertise in astrophysics imaging and linear algebra coupled with my parallel processing knowledge to work this through on the NERSC machines,” Nugent said.

Nugent’s work has attracted the attention of many research groups and will provide a valuable dataset for the entire astrophysical community.

Aside from Nugent and Borrill, other C3 members are Chris Cantalupo, Ted Kisner, Rollin Thomas and Sebastien Bongard. Thomas was recently named a Luis Alvarez Computational Science Fellow.


About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery. Researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab’s Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.