
NERSC Played Key Role in Nobel Laureate’s Discovery

NERSC, Berkeley Lab Now Centers for Computational Cosmology Community

October 4, 2011

By Jon Bashor
Contact: cscomms@lbl.gov

In the 1990s, Saul Perlmutter discovered that the universe is expanding at an accelerating rate. He confirmed his observational conclusions by running thousands of simulations at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory (Berkeley Lab). As a result of this groundbreaking work, Perlmutter was awarded the 2011 Nobel Prize in Physics. His research team is believed to have been the first to use supercomputers to analyze and validate observational data in cosmology. This melding of computational science and cosmology sowed the seeds for more projects, establishing Berkeley Lab and NERSC as centers for the emerging field.

The Supernova Cosmology Project

The Supernova Cosmology Project (SCP), co-led by Perlmutter, used a robotic telescope equipped with a digital detector instead of photographic plates. Its digital images were compared with earlier images of the same fields using “subtraction” software. By 1994 the SCP team could discover supernovae “on demand,” and Perlmutter realized he would soon need more computing power to analyze the growing flow of data. NERSC’s move to Berkeley Lab in 1996 provided the perfect opportunity for his team.
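The article does not detail the SCP’s subtraction software, but the core idea behind difference imaging is straightforward: subtract a reference image of a field from a new image and look for bright residuals. The toy Python sketch below is purely illustrative (the function, threshold, and synthetic data are hypothetical, not the team’s code) and assumes the two frames are already astrometrically aligned and flux-matched.

# Minimal sketch of difference imaging for transient detection.
# Hypothetical example only; not the SCP's actual pipeline.
import numpy as np

def find_candidates(new_frame: np.ndarray, ref_frame: np.ndarray,
                    n_sigma: float = 5.0):
    """Subtract a reference image from a new image and flag bright residuals."""
    diff = new_frame - ref_frame
    # Robust noise estimate of the difference image (median absolute deviation).
    noise = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    # Pixels that brightened by more than n_sigma are candidate transients.
    ys, xs = np.where(diff > n_sigma * noise)
    return list(zip(xs.tolist(), ys.tolist()))

# Synthetic demo: a flat sky plus one injected "supernova" pixel.
rng = np.random.default_rng(0)
ref = rng.normal(100.0, 3.0, size=(64, 64))
new = ref + rng.normal(0.0, 3.0, size=(64, 64))
new[40, 25] += 200.0                      # injected transient
print(find_candidates(new, ref))          # roughly [(25, 40)]

In a real pipeline the image alignment, flux matching, and point-spread-function matching dominate the work; this sketch shows only the final subtract-and-threshold step.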

With a Laboratory Directed Research and Development (LDRD) grant, the NERSC and Physics divisions jointly hired a postdoc. Peter Nugent—now leader of the NERSC Analytics Team and co-leader of Berkeley Lab’s Computational Cosmology Center (C3)—helped the group develop parallel algorithms that could run on 128 processors at once, taking full advantage of NERSC’s “Mcurie” system, a 512-processor Cray T3E-900 supercomputer.
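The original T3E codes long predate today’s Python tooling, but the overall pattern, many independent simulations fanned out across processors, is easy to convey. A rough modern stand-in (hypothetical names and numbers, with local worker processes in place of the T3E’s processing elements) might look like this:

# Loose modern analogue, not the original T3E code: fan independent
# supernova simulations out over worker processes, one task per event.
from multiprocessing import Pool
import random

def simulate_supernova(seed: int) -> float:
    """Stand-in for one simulated event; returns a toy 'observed' peak magnitude."""
    rng = random.Random(seed)
    true_peak = -19.3                       # toy absolute magnitude for a Type Ia
    return true_peak + rng.gauss(0.0, 0.2)  # add toy measurement scatter

if __name__ == "__main__":
    with Pool(processes=8) as pool:         # 8 local workers stand in for 128 PEs
        peaks = pool.map(simulate_supernova, range(10_000))
    print(f"simulated {len(peaks)} events, mean peak = {sum(peaks) / len(peaks):.2f}")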

Perlmutter (far left) with members of the Supernova Cosmology Project team. Nugent, now leader of the NERSC Analytics Team and co-leader of Berkeley Lab’s Computational Cosmology Center, stands center rear. (Roy Kaltschmidt, LBNL)


To analyze the data from 40 supernovae for errors or biases, Nugent simulated 10,000 exploding supernovae at varying distances and under varying circumstances. These simulated events were then plotted and compared with the observed data to detect any biases affecting observation or interpretation. The Cray T3E was also used to check and recheck the team’s work: by resampling the data and rerunning the calculations thousands of times, the researchers could gauge the reliability of their measurements.
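The resampling step is essentially what statisticians call bootstrapping: redraw the observed sample with replacement many times, redo the calculation on each draw, and see how much the answer wobbles. The short Python illustration below uses made-up numbers standing in for the roughly 40 supernova measurements; it is a generic sketch of the technique, not the team’s actual analysis.

# Bootstrap illustration with toy data; not the SCP's analysis code.
import numpy as np

rng = np.random.default_rng(42)
# Toy stand-in for a measured quantity from ~40 supernovae (arbitrary values).
measurements = rng.normal(0.7, 0.1, size=40)

n_resamples = 10_000
boot_means = np.empty(n_resamples)
for i in range(n_resamples):
    # Redraw the sample with replacement and recompute the statistic.
    resample = rng.choice(measurements, size=measurements.size, replace=True)
    boot_means[i] = resample.mean()

# The spread of the bootstrap results estimates the measurement's reliability.
print(f"estimate = {measurements.mean():.3f} +/- {boot_means.std():.3f}")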

These rigorous, supercomputer-powered analyses of potential biases reduced the uncertainties in the data and helped Perlmutter’s team win widespread acceptance of their conclusions in the scientific community.

Computational Cosmology Legacy

The story, however, doesn’t end there. Perlmutter’s work inspired a profusion of computational cosmology research centered at Berkeley Lab and NERSC:

BOOMERANG, a 1999 balloon-borne survey of the cosmic microwave background (CMB), made close to one billion measurements of CMB temperature variations. Analysis of the BOOMERANG dataset at NERSC established that the universe is flat: its geometry is Euclidean, not curved. Nearly all CMB experiments launched since have used NERSC for data analysis in some capacity, and today NERSC supports around 100 researchers from a dozen CMB experiments.

Planck, the European Space Agency’s satellite observatory, is yielding 300 billion samples per year. The C3 team spent nearly a decade developing the supercomputing infrastructure for the U.S. Planck Team’s data and analysis operations at NERSC (and received a NASA Public Service Group Award for their efforts).

Nearby Supernova Factory (SNfactory), an international collaboration between several groups in the United States and France, collects detailed observations of low-redshift supernovae. By 2003 the SNfactory was discovering eight supernovae per month, a rate made possible by a high-speed data link, custom data-pipeline software, and NERSC’s ability to store and process 50 GB of data per night. Sunfall, a collaborative visual analytics system developed jointly by the Computational Research and Physics divisions, has eliminated 90 percent of the human labor once involved in supernova identification and follow-up, while cutting false positives by a factor of 10.

Palomar Transient Factory (PTF), an innovative sky survey, is the first project dedicated solely to finding “transient” astronomical events, including supernovae. A team at Caltech worked with NERSC to develop an automated system that sifts through terabytes of astronomical data every night to find interesting events. PTF has discovered more than 1,300 supernovae, three of which form new classes of these objects. Last month PTF discovered one of the closest Type Ia supernovae in the last 40 years, SN 2011fe, in the nearby Pinwheel galaxy.

Deep Sky Project is a Web-accessible collection of 11 million astronomical images (70 terabytes of data) housed at NERSC. The brainchild of Nugent, who was on Perlmutter’s SCP team, the database is made up of images taken over a decade and covering nearly all of the northern sky, about 20,000 square degrees. Astronomers from around the world can download these composite images at the Deep Sky Project web site.


About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery. Researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab’s Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.