Brace for Impact: Why Does Matter Dominate Our Universe?
April 30, 2010
Contact: Margie Wylie, MWylie@lbl.gov, 510-486-7421
While the fireworks at CERN's Large Hadron Collider (LHC) transfix the world, theorists are quietly doing some computational heavy lifting to help understand what these particle smash-ups might reveal about the fundamental mystery of existence: Why is there anything at all?
The Standard Model of particle physics can't explain why the universe contains more matter than antimatter. At the LHC and other colliders, scientists sift the debris of high-energy particle collisions, searching for clues to physics that lies beyond our current understanding.
However, in order for scientists to "claim they've seen something beyond the Standard Model—a very important claim—they would need to know with quite high precision what the Standard Model predicts," said William Detmold, an assistant professor of physics at the College of William and Mary and senior staff scientist at the Department of Energy's Thomas Jefferson National Accelerator Facility. "That's what we try to calculate," said Detmold, whose group computes at NERSC.
Detmold works in the field of quantum chromodynamics (QCD), the mathematical theory describing the strong force that binds quarks together into protons, neutrons and other less-familiar subatomic particles. QCD also governs how these composite particles interact with one another. Through a computational method called lattice QCD (LQCD), Detmold's team painstakingly calculates the properties of subatomic particles in various combinations.
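In the lattice approach, continuous spacetime is replaced by a four-dimensional grid: quark fields live on the grid's sites, and the gluon fields that bind them live on the links between neighboring sites. The Python sketch below illustrates just this data layout; the grid dimensions and the "cold start" initialization are illustrative assumptions, not details of Detmold's production runs.

    import numpy as np

    # Toy sketch of the lattice QCD data layout (illustrative only).
    # Spacetime becomes a 4-D grid: quark fields sit on the sites, and one
    # 3x3 complex SU(3) "link" matrix per direction carries the gluon
    # field between neighboring sites.
    L, T = 8, 16                        # assumed spatial and temporal extents
    shape = (L, L, L, T)

    # A "cold start" initializes every link matrix to the identity.
    links = np.zeros(shape + (4, 3, 3), dtype=complex)
    links[:] = np.eye(3)

    # A quark field carries 4 spin and 3 color components at every site.
    quark = np.zeros(shape + (4, 3), dtype=complex)

    print(f"{np.prod(shape)} sites, {np.prod(shape) * 4} link matrices")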
Using NERSC systems, Detmold and colleagues achieved the first-ever QCD calculations of both a three-body force between hadrons and a three-body baryon system. They reported their findings in the journal Physical Review Letters. In a subsequent paper, Detmold and Martin Savage of the University of Washington reported another first: a QCD calculation that could help scientists better understand the quark soup that was our universe microseconds after its birth.
A better understanding of these interactions will help physicists build more accurate models of atomic nuclei. The findings also tell scientists what they ought to see when certain particles collide; any measurement outside those predicted values could indicate new phenomena.
While Detmold's research may help form a jumping-off point for the discovery of exotic physics, he emphasizes that the work is confined to the Standard Model we know today: "The overall goal of our project is to provide a QCD-based understanding of the basics of nuclear physics, how protons and neutrons interact with each other and with other particles," Detmold said.
That's more easily said than done. Calculating the properties of subatomic particles is a fiendishly difficult business that requires billions of calculations consuming millions of processor hours. Particles can't simply be torn apart and studied quark by quark, their parts summed to make a whole. Instead, quarks exist in a seething, quantum soup of other quarks, antiquarks, and gluons, all of which must be taken into account (see sidebar "The ABCs of Lattice QCD"). Also, the quantum nature of quarks requires simulations to be run over and over (using random starting points) to derive average values.
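That repeat-and-average procedure is, at heart, Monte Carlo estimation. The toy Python sketch below shows the principle: a quantity is measured on many randomly generated samples and averaged, and the statistical uncertainty shrinks as the number of samples grows. The stand-in "observable" here is an assumption for illustration, not an actual QCD quantity.

    import math
    import random

    # Toy Monte Carlo averaging: each "configuration" is one random sample,
    # and the physical answer emerges as the average over many of them.
    def measure_observable(rng):
        # Stand-in measurement: true value 1.0 plus quantum "noise".
        return 1.0 + rng.gauss(0.0, 0.5)

    rng = random.Random(2010)
    samples = [measure_observable(rng) for _ in range(10_000)]

    mean = sum(samples) / len(samples)
    variance = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
    stderr = math.sqrt(variance / len(samples))   # falls like 1/sqrt(N)

    print(f"estimate = {mean:.4f} +/- {stderr:.4f}")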
"Ideally we'd simulate the whole nucleus of a carbon atom inside our computer and try to directly calculate from QCD its binding energy," said Detmold. Even with today's computing power and algorithms, that’s just not possible. "We're talking exascale-sized computations here," he said.
Instead, Detmold and colleagues substituted simpler proxies in their three-body interactions. One calculation used pions, the simplest composite particles formed from quarks in nature. Physicists can use these calculations to predict the properties of similar but more complex structures, such as atomic nuclei.
Even at their simplest, however, LQCD calculations require massively parallel supercomputers: "Machines like Franklin are very important because they have a large amount of processing power, and the fact that they are highly parallel lets us do our calculations much faster and allows us to do calculations not possible on smaller computers," Detmold said. At NERSC, Detmold's calculations ran on 4,000 cores at once. In 2009 alone, Detmold's team consumed about 30 million processor hours on systems around the world, of which NERSC provided over a third (12 million).
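A rough sense of how such a run spreads across 4,000 cores: production LQCD codes typically use domain decomposition, handing each core an equal sub-block of the global lattice. The lattice and processor-grid dimensions in the sketch below are assumptions chosen to divide evenly, not the dimensions of Detmold's actual runs.

    # Toy domain decomposition: split a global 4-D lattice evenly across
    # 4,000 cores, the standard parallelization strategy in LQCD codes.
    # All dimensions are illustrative assumptions.
    global_lattice = (40, 40, 40, 80)   # assumed global extents (x, y, z, t)
    proc_grid = (10, 10, 10, 4)         # 10 * 10 * 10 * 4 = 4,000 cores

    # Each processor-grid dimension must divide the lattice dimension evenly.
    assert all(g % p == 0 for g, p in zip(global_lattice, proc_grid))

    local_block = tuple(g // p for g, p in zip(global_lattice, proc_grid))
    sites_per_core = 1
    for extent in local_block:
        sites_per_core *= extent

    cores = 1
    for p in proc_grid:
        cores *= p

    print(f"each of {cores} cores holds a {local_block} block "
          f"({sites_per_core} lattice sites)")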
The Search Goes On
Scientists at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory recently reported that they had produced a quark-gluon plasma, the same cosmic soup that existed microseconds after the Big Bang. The plasma itself lasted a tiny fraction of a second. A major part of the evidence for this exotic state of matter was that it emitted too few J/psi particles, the bound charm quark-antiquark pairs that shower from heavy-particle collisions. This suppression is known as J/psi screening.
"We wondered if the protons and neutrons not caught up in this plasma could also cause J/psi screening," Detmold said. Calculating a similar interaction with a lattice containing 12 pions, they found a similar J/psi screening effect, albeit to far lesser degree.
"At the RHIC, what's being collided are nuclei, which are all protons and neutrons. The pion system we're looking at is not the system that's there, but it's the simplest multi-hadronic system we can look at," said Detmold. "It indicates that at least some of the screening [at the RHIC] could be coming from hadrons," he concluded.
At NERSC, Detmold's group also concentrates on particles containing bottom-type quarks, called bottom baryons. One of the four major experiments at the LHC is dedicated to probing for novel physics among these particles. Detmold said: "Our studies will contribute an important ingredient in LHC searches for physics beyond the Standard Model in the bottom-baryon sector."
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, creating new materials, and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 7,000-plus scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are Department of Energy Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the DOE's Office of Science.
DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.