When the National Aeronautics and Space Administration (NASA) sets out to study the terabytes of cosmological data from the Planck space telescope, it relies on the expertise of many computational physicists and scientific data experts. Reijo Keskitalo, a computer systems engineer in the Scientific Data Division (SciData) at Lawrence Berkeley National Laboratory (Berkeley Lab), was part of a team of researchers who studied that data and published the final paper from the Planck mission, detailing the development of the NPIPE processing pipeline for high-performance computing (HPC) systems, specifically those at the National Energy Research Scientific Computing Center (NERSC).

Keskitalo, a member of SciData’s Computational Cosmology Center (CCC), was awarded NASA’s Exceptional Public Achievement Medal for “developing novel tools and approaches for maximizing NASA’s understanding of the universe from the Planck mission data.” NASA recognized Keskitalo for his contribution to that work in 2021; however, the COVID-19 pandemic delayed the award ceremony until June 2023.

Planck is a European Space Agency mission that launched a satellite telescope into space in May 2009. The Planck mission’s objective is to analyze, with the highest accuracy ever achieved, the cooled remnants of the first light that filled the Universe. This radiation, the farthest back in time any telescope can see, is the afterglow of the Big Bang and is referred to as the Cosmic Microwave Background (CMB). The secrets of the Universe are embedded within the intricate patterns of matter and radiation that exist in the cosmos today.

Planck carries two scientific instruments: the High Frequency Instrument (HFI) and the Low Frequency Instrument (LFI). Their detectors converted the microwave and radio light gathered by the telescope into very accurate maps of the microwave sky. However, the data processing for the two instruments was developed independently, and the Planck mission needed a way to process the HFI and LFI data together. The processing also had to run efficiently on current HPC systems and adapt to new and future HPC architectures. For this purpose, SciData engineers developed a new data analysis pipeline called NPIPE.

NPIPE builds on an existing data analysis framework, TOAST, drawing inspiration from both the LFI and HFI data processing and combining the best features of the prior methods into a single analysis pipeline. NPIPE introduces several improvements that enhance the accuracy and quality of the analysis. For example, it uses dedicated techniques to account for instrument-specific measurement errors and to correct for differences in frequency response. These improvements result in lower levels of noise and systematic error in the final maps created from the data. With the help of NPIPE, researchers were able to estimate the solar dipole and the optical depth of reionization more accurately.
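At its core, a pipeline like NPIPE turns enormous volumes of time-ordered detector samples into maps of the sky. As a rough illustration of that reduction step, and not the actual NPIPE or TOAST code, the short Python sketch below bins simulated time-ordered data into a map by averaging every sample that falls in a given sky pixel; the pixel count, signal model, and noise level are toy assumptions.

```python
import numpy as np

# Minimal illustration of binned map-making: average all time-ordered
# samples that fall into each sky pixel. This is NOT the NPIPE/TOAST code;
# the pixelization, signal, and noise here are toy assumptions.
# (Real Planck maps use HEALPix pixelization with millions of pixels.)

rng = np.random.default_rng(0)

n_pix = 768           # toy number of sky pixels
n_samples = 100_000   # toy number of time-ordered samples

true_sky = rng.normal(0.0, 100.0, n_pix)        # synthetic sky signal (microkelvin)
pointing = rng.integers(0, n_pix, n_samples)    # pixel observed at each time sample
tod = true_sky[pointing] + rng.normal(0.0, 30.0, n_samples)  # signal + white noise

# Bin: accumulate the signal and the hit count per pixel, then divide.
signal_sum = np.bincount(pointing, weights=tod, minlength=n_pix)
hits = np.bincount(pointing, minlength=n_pix)
binned_map = np.divide(signal_sum, hits, out=np.zeros(n_pix), where=hits > 0)

print("RMS map error (uK):", np.sqrt(np.mean((binned_map - true_sky) ** 2)))
```

Real pipelines go well beyond this simple binning, using techniques such as destriping to suppress correlated noise and running in parallel across many HPC nodes.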

“A novelty factor in separating astrophysical components from NPIPE frequency maps comes from the fact that we didn’t remove an estimated solar system dipole from the data during reduction,” said Keskitalo. “This allowed our colleagues in Oslo to include it in the component separation analysis and provided for a more robust estimate.”

The solar dipole, also known as the peculiar velocity dipole, refers to a dipole pattern observed in the CMB that is caused by the solar system’s motion relative to the CMB rest frame. This motion makes the CMB appear slightly hotter in the direction toward which we are moving and slightly colder in the opposite direction.
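To first order, the dipole amplitude follows directly from that velocity; the numbers below use commonly quoted values (a speed of roughly 370 km/s and a mean CMB temperature of about 2.725 K) for illustration, not the NPIPE measurement itself:

```latex
\Delta T(\theta) \;\approx\; T_{\mathrm{CMB}}\,\frac{v}{c}\,\cos\theta,
\qquad
T_{\mathrm{CMB}}\,\frac{v}{c} \;\approx\; 2.725\ \mathrm{K}\times\frac{370\ \mathrm{km\,s^{-1}}}{3\times10^{5}\ \mathrm{km\,s^{-1}}} \;\approx\; 3.4\ \mathrm{mK}
```

That few-millikelvin signal is far brighter than the faint anisotropies Planck maps, which is one reason its treatment during data reduction matters for component separation.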

To ensure the reliability of the analysis, the team performed 600 simulations on NERSC’s Cori system, including detailed modeling of the data and its processing. The release of NPIPE maps, simulations, and associated software allows other scientists to improve the analysis further and conduct their own simulations using the provided data and tools.

“As a demonstration of the fidelity of the data products and the 600 accompanying simulated noise and signal maps, we estimated the optical depth to reionization using two different methodologies,” said Keskitalo. “This was the first time the Planck 44GHz maps were clean enough to derive independent reionization estimates. The greater inter-frequency consistency improved foreground clearing and allowed us to use more of the sky in the analysis. And of course, the lower noise level reduced overall statistical uncertainty.”

The optical depth of reionization is a way to measure how much of the Universe has transformed from a neutral state to an ionized state. In the early days of the Universe, space was filled with neutral hydrogen atoms. As time passed, energetic sources like the first stars and galaxies emitted powerful light that stripped electrons from these atoms, ionizing them. The optical depth tells us how likely it is that CMB photons will scatter off these freed electrons as they travel through space. By studying the optical depth, scientists can learn about when and how this ionization process happened, shedding light on the early history of our Universe and the formation of galaxies.
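In textbook terms (this is the standard definition, not something specific to the NPIPE analysis), the optical depth is the Thomson scattering cross section integrated against the free-electron density along the line of sight, and it damps the observed CMB anisotropy power on small angular scales:

```latex
\tau \;=\; \sigma_{\mathrm{T}} \int n_{e}\,\mathrm{d}l,
\qquad
C_{\ell}^{\,\mathrm{obs}} \;\approx\; e^{-2\tau}\, C_{\ell}^{\,\mathrm{intrinsic}} \quad (\ell \gg 1)
```

Here \(\sigma_{\mathrm{T}}\) is the Thomson cross section, \(n_{e}\) is the free-electron number density along the line of sight, and \(C_{\ell}\) is the CMB angular power spectrum; a larger \(\tau\) means more scattering and stronger suppression of the small-scale anisotropies.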

“As a Planck first, these simulations include full time-domain processing of the beam-convolved CMB anisotropies, the slight variations or fluctuations in temperature or density across different regions of the CMB,” said Keskitalo. “The release of these specialized maps and simulations is accompanied by a complete suite of raw and processed time-ordered data and the software, scripts, auxiliary data, and parameter files that can help improve analysis and assist in running matching simulations.”

The paper detailing this work was published among the Planck intermediate results in July 2020. Keskitalo’s colleague at the lab, Julian Borrill, a senior scientist in the CCC, received the same NASA medal for his work on the Planck mission in 2016.

About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery. Researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.