Supercomputers Speed Search for New Subatomic Particles
Lattice QCD Computations Run at NERSC Help Find Unexpected 'Penguin' Decay
December 2, 2015
Contact: Kathy Kincade, firstname.lastname@example.org, 510-495-2124
A team of theoretical high-energy physicists in the Fermilab Lattice and MILC Collaborations has published a new, high-precision calculation that could significantly advance the indirect search for physics beyond the Standard Model. The calculation applies to a particularly rare decay of the B meson (a subatomic particle), a process sometimes called a “penguin” decay.
Their findings were published October 7, 2015, in Physical Review Letters.
After being produced in a collision, subatomic particles spontaneously decay into other particles, following one of many possible decay paths. Out of one billion B mesons detected in a collider, only about 20 decay through this particular process.
With the discovery of the Higgs boson, the last missing piece, the Standard Model of particle physics now accounts for all known subatomic particles and correctly describes their interactions. It’s a highly successful theory, in that its predictions have been verified consistently by experimental measurements. But scientists know that the Standard Model doesn’t tell the whole story, and researchers around the globe are eagerly searching for evidence of physics beyond the Standard Model.
“We have reason to believe that there are yet undiscovered subatomic particles that are not part of the Standard Model,” explained Fermilab scientist Ruth Van De Water. “Generally, we expect them to be heavier than any subatomic particles we have found so far. The new particles would be part of a new theory that would look like the Standard Model at low energies. Additionally, the new theory should account for the astrophysical observations of dark matter and dark energy. The particle nature of dark matter is a complete mystery.”
“Scientists are attacking this problem from several directions,” noted University of Illinois physicist Aida El-Khadra, one of several co-authors. “Indirect searches focus on virtual effects that the conjectured new heavy particles may have on low-energy processes. Direct searches look for the production of new heavy particles in high-energy collisions. The interplay of both indirect and direct searches may ultimately provide us with enough pieces of the puzzle to make out the new underlying theory that would explain all of these phenomena.”
So-called “penguin decays” provide powerful probes of new physics, noted Syracuse University physicist John “Jack” Laiho.
“In the observation of a rare decay, because contributions from the Standard Model are relatively small, there is a good possibility that contributions from new virtual heavy particles may be significant,” he said. “These would be observed as deviations from Standard Model predictions. However, in order to know that such a deviation (if observed) is not just a statistical fluctuation, the difference must be conclusive—it must be at least five times larger than the experimental and theoretical uncertainties. So rare decays require high precision in both the experimental measurements and the theoretical calculations.”
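The five-sigma discovery criterion Laiho describes can be sketched in a few lines: the experimental and theoretical uncertainties are combined in quadrature, and a deviation counts only if it exceeds five times that combined uncertainty. All numerical values below are invented purely for illustration; they are not the measured or predicted penguin-decay rates.

```python
import math

def significance(measured, predicted, sigma_exp, sigma_th):
    """Deviation between measurement and prediction, in units of the
    combined (quadrature) experimental + theoretical uncertainty."""
    sigma_total = math.sqrt(sigma_exp**2 + sigma_th**2)
    return abs(measured - predicted) / sigma_total

# Hypothetical numbers: measured rate 25 (arbitrary units), prediction 20,
# experimental uncertainty 0.8, theoretical uncertainty 0.6.
s = significance(25.0, 20.0, 0.8, 0.6)
print(f"deviation = {s:.1f} sigma")              # prints "deviation = 5.0 sigma"
print("discovery" if s >= 5.0 else "fluctuation?")
```

Note how shrinking either uncertainty raises the significance of the same deviation, which is why improving the theory calculation matters as much as improving the measurement.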
High-precision Lattice QCD
B mesons belong to a class of subatomic particles that are bound states of quarks; they feel the so-called strong interactions, described by the theory with the colorful name quantum chromodynamics (QCD). Quarks are found inside protons and neutrons—which make up the atomic nucleus—as well as within other subatomic particles, such as pions and the aforementioned B mesons. The new high-precision calculation employs lattice QCD to calculate the effects of the strong interaction on the process in question.
“Decay processes that involve bound states of quarks receive contributions from the strong interactions, which are very difficult to quantify, especially at low energies,” said Fermilab scientist Andreas Kronfeld, another co-author on the Physical Review Letters paper. “The only first-principles method for calculating with controlled errors the properties of subatomic particles containing quarks is lattice QCD, where the unwieldy integrals of QCD are cast into a form that makes it possible to calculate them numerically.”
The team’s high-precision lattice QCD calculation required large-scale computational resources, including supercomputers and allocations at Fermilab (provided by the USQCD Collaboration), the National Energy Research Scientific Computing Center (NERSC), the Argonne Leadership Computing Facility, Los Alamos National Laboratory, the National Institute for Computational Sciences, the Pittsburgh Supercomputing Center, the San Diego Supercomputer Center and the Texas Advanced Computing Center.
After completing the new calculation and prior to its publication in Physical Review Letters, the LHCb experiment at CERN in Switzerland announced a new experimental measurement of the differential decay rate for this decay process.
“The recent measurements are compatible with our Standard Model predictions, with commensurate uncertainties from theory and experiment,” said Fermilab scientist and co-author Ran Zhou. “This puts interesting constraints on possible new physics contributions which are very useful for building models of beyond the Standard Model physics.”
Generating the lattices and characterizing them is “a lot of work,” noted Doug Toussaint, professor of physics at the University of Arizona and the PI who oversees this project’s allocation at NERSC. This latest finding “completed the picture of all things that are possible in the decay of this particular meson,” he said. “We have been generating (by Monte Carlo simulation) sample configurations of fields for QCD. Then it’s possible to calculate many things by averaging the desired quantity over these sample configurations, or lattices. Our group and other groups have used these stored lattices to calculate many strong interaction masses and decay rates in addition to this latest one. A lot of our computer time goes into generating these samples, and a lot of those used in this latest project were generated at NERSC.”
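The workflow Toussaint describes — generate field configurations by Monte Carlo, then estimate observables by averaging over them — can be illustrated with a drastically simplified toy model. The sketch below uses a one-dimensional quantum harmonic oscillator on a Euclidean time lattice, a standard pedagogical stand-in for QCD (the real calculation involves four-dimensional gauge fields and supercomputers); all parameters and the observable are chosen only for illustration.

```python
import math
import random

# Toy lattice Monte Carlo: generate configurations of a field x[i] on N time
# slices with the Metropolis algorithm, then average an observable over them.
random.seed(1)
N = 20          # lattice sites (Euclidean time slices)
a = 0.5         # lattice spacing
n_cfg = 2000    # configurations kept for measurements
n_skip = 10     # sweeps between kept configurations (reduces correlation)

def action_site(x, i):
    """Contribution of site i to the Euclidean action
    S = sum_i [ (x[i+1]-x[i])^2 / (2a) + a*x[i]^2 / 2 ]  (m = omega = 1)."""
    ip, im = (i + 1) % N, (i - 1) % N
    return ((x[ip] - x[i])**2 + (x[i] - x[im])**2) / (2 * a) + a * x[i]**2 / 2

def sweep(x, step=1.0):
    """One Metropolis sweep: propose a random shift at each site and accept
    it with probability exp(-dS), so configurations sample exp(-S)."""
    for i in range(N):
        old, s_old = x[i], action_site(x, i)
        x[i] = old + random.uniform(-step, step)
        if random.random() >= math.exp(-(action_site(x, i) - s_old)):
            x[i] = old   # reject the proposal

x = [0.0] * N
for _ in range(200):          # thermalize before measuring
    sweep(x)

# Measure <x^2> by averaging over the stored configurations.
measurements = []
for _ in range(n_cfg):
    for _ in range(n_skip):
        sweep(x)
    measurements.append(sum(xi * xi for xi in x) / N)

mean = sum(measurements) / n_cfg
# Naive standard error (assumes the skipped sweeps decorrelate the samples).
err = math.sqrt(sum((m - mean)**2 for m in measurements) / (n_cfg * (n_cfg - 1)))
print(f"<x^2> = {mean:.3f} +/- {err:.3f}")
```

The expensive step is generating and storing the configurations; once they exist, many different observables can be measured on the same ensemble, which is exactly why the stored NERSC lattices could be reused across many mass and decay-rate calculations.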
This article was adapted from materials provided by University of Illinois College of Engineering.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 7,000-plus scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are Department of Energy Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.