HPC Explorations of Supernova Explosions Help Physicists Reach New Milestones
3D modeling at NERSC opens new doors in collaborative research
February 25, 2021
Physicists have been studying the question of how supernova explosions occur for more than 60 years. Thanks to the increasing power of supercomputing resources such as those at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, they’re moving ever closer to an answer.
Most supernovae mark the deaths of massive stars: enormous explosions that punctuate the ends of stellar lifetimes spanning millions of years. These explosions leave remnants behind: neutron stars, black holes, and newly synthesized chemical elements added to the interstellar medium. But how, exactly, does such an ancient star finally explode?
As newly detailed in the review article “Core-collapse supernova explosion theory,” published January 6, 2021, in Nature, the degenerate core that forms at the center of a massive progenitor star — a structure resembling a white dwarf — becomes gravitationally unstable to implosion once it grows to about 1.5 solar masses. At the key moment, the star’s dense core collapses in less than a second. When the collapsing core reaches nuclear densities it rebounds — the “bounce” — launching an outward-traveling shockwave that could in principle cause a supernova, but doesn’t; models show it stalling under the ram pressure of accreting matter before it can trigger the massive explosion that rips the star apart. Instead, simulations show that the supernova explosion is primarily driven by the asymmetry of the progenitor star, turbulent convection caused by the extreme heat and density resulting from the collapse and the bounce, and the interactions of the enormous quantities of neutrinos produced inside.
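The sub-second collapse quoted above can be checked with a back-of-the-envelope estimate: the free-fall (dynamical) timescale of a self-gravitating body depends only on its density. The sketch below uses an assumed representative pre-collapse core density of about 10⁹ g/cm³ — an illustrative round number, not a value taken from the review article.

```python
import math

# Free-fall timescale of a uniform self-gravitating sphere:
#   t_ff = sqrt(3*pi / (32 * G * rho))
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
rho_core = 1e12    # assumed pre-collapse core density, kg/m^3 (~1e9 g/cm^3)

t_ff = math.sqrt(3 * math.pi / (32 * G * rho_core))
print(f"free-fall time ~ {t_ff * 1e3:.0f} ms")  # tens of milliseconds
```

Even with this rough density assumption, the result comes out at tens of milliseconds — comfortably consistent with the core collapsing “in less than a second.”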
“This problem [of explaining supernova explosions] has been around for 60 years, and it parallels developments in physics and the availability of supercomputers,” said Adam Burrows, professor of astrophysical sciences at Princeton University and a longtime NERSC user who co-authored the paper with David Vartanyan of the University of California, Berkeley. For decades, scientists toiled over painstaking models of supernova explosions — but progress only moved as quickly as the computers could run their simulations.
“With more dimensions comes more complexity,” said Burrows. Scientists originally modeled supernova explosions in a single dimension, adding a second dimension and later a third as computing power could accommodate them — though only in the last five to ten years has it been possible to run more than one 3D simulation per year. Until recently, a single simulation took months or more to complete.
Multiple Complex 3D Simulations
Today, with the speed and power of high-performance computing systems at facilities like NERSC, these modeling processes can now be completed much more quickly and frequently, offering the chance to run multiple complex 3D simulations each year.
“NERSC is home to the first machine where we could get 3D [modeling] going with the code we’re using now,” said Burrows. “We developed the code on NERSC’s Cori system, using regular default time, and it provided the foundation we needed. It’s very well suited to what we’re doing.”
This increased frequency allows scientists to tinker with their simulations and explore different profiles and approximations of the explosions — which will only become more useful as scientists pursue the details of supernova explosions and the role of entropy, a measure of disorder, in the turbulent convection that drives them.
“What you want is to be able to make a lot of mistakes, fast,” said Burrows. Additionally, capacity for more simulations allows different teams to overlap and begin to form consensus, yielding a clearer picture of the mechanisms behind supernova explosions.
As high-performance computing technology and power continue to improve, scientists studying supernovae will attempt to zoom in on the details of the explosions — neutrino oscillation, rotational effects, and the possible influence of magnetic fields — realizing goals scientists have held for over half a century. “This study was an international effort, and it builds on the work of many people over the decades,” said Burrows. “We’re in a time when the stars have aligned.”
NERSC is a U.S. Department of Energy Office of Science user facility.
About Computing Sciences at Berkeley Lab
High performance computing plays a critical role in scientific discovery, and researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab’s Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.