CRD’s AMR Methods Accelerate MHD Simulations
May 1, 2004
The Supercomputing conference, held each November, is well known as a sort of international watering hole where many of the world's leading experts in high-performance computing gather for a week to take stock of the competition, exchange ideas, and make new connections.
At SC2000 in Dallas, Phil Colella, head of the Applied Numerical Algorithms Group at Lawrence Berkeley National Laboratory, and Steve Jardin, co-leader of the Computational Plasma Physics Group at Princeton Plasma Physics Laboratory, were both scheduled to give talks in the Berkeley Lab booth. Colella discussed “Adaptive mesh refinement research and software at NERSC,” where Jardin has conducted his scientific computing for years. Jardin gave a presentation on “A Parallel Resistive MHD Program with Application to Magnetic Reconnection.”
The scientist and the mathematician got to talking between their presentations and one thing led to another. They began an informal collaboration which was soon formalized under the auspices of SciDAC—Jardin is principal investigator for the Center for Extended Magnetohydrodynamic Modeling (CEMM), while Colella is PI for the Applied Partial Differential Equations Center (APDEC). Jardin’s group was able to incorporate the CHOMBO adaptive mesh refinement (AMR) code developed by Colella’s group into a new fusion simulation code, which is now called the Princeton AMRMHD code. “Using the AMR code resulted in a 30 times improvement over what we would have had with a uniform mesh code of the highest resolution,” Jardin says.
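The payoff Jardin describes comes from refining the mesh only where the solution has sharp features, rather than everywhere. The toy sketch below illustrates that idea in one dimension; it is not the Chombo API, and the feature function, refinement ratio, and tagging threshold are invented for illustration.

```python
# Toy illustration of why adaptive mesh refinement (AMR) saves work:
# refine only the cells that contain a sharp feature, instead of
# refining everywhere. This is a sketch of the idea, not the Chombo API.
import math

def feature(x):
    # A steep front near x = 0.5, standing in for a shock or current sheet.
    return math.tanh((x - 0.5) / 0.01)

def tag_cells(n_coarse, threshold=0.1):
    """Tag coarse cells whose solution jump exceeds `threshold`."""
    h = 1.0 / n_coarse
    tagged = []
    for i in range(n_coarse):
        jump = abs(feature((i + 1) * h) - feature(i * h))
        if jump > threshold:
            tagged.append(i)
    return tagged

n_coarse = 64
ratio = 16           # refinement ratio between coarse and fine level
tagged = tag_cells(n_coarse)

# Uniform fine mesh: every coarse cell refined by `ratio`.
uniform_cells = n_coarse * ratio
# AMR hierarchy: coarse mesh everywhere, fine patches only where tagged.
amr_cells = n_coarse + len(tagged) * ratio

print(f"tagged coarse cells: {len(tagged)} of {n_coarse}")
print(f"uniform fine mesh:   {uniform_cells} cells")
print(f"AMR hierarchy:       {amr_cells} cells")
print(f"savings factor:      {uniform_cells / amr_cells:.1f}x")
```

In this toy setup only two of 64 coarse cells get refined, so the hierarchy carries roughly a tenth of the cells of the uniform fine mesh; in a real multi-dimensional MHD run the savings compound across dimensions and time steps, which is how factors like the 30x Jardin cites arise.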
The AMRMHD code, developed in conjunction with Princeton researcher Ravi Samtaney and Berkeley Lab researcher Terry Ligocki, is already producing new physics results as well. It powered the first simulation demonstrating that the presence of a magnetic field will suppress the growth of the Richtmyer-Meshkov instability when a shock wave interacts with a contact discontinuity separating ionized gases of different densities. The upper and lower images in Figure 3 contrast the interface without (upper) and with (lower) the magnetic field. In the presence of the field, the vorticity generated at the interface is transported away by the fast and slow MHD shocks, removing the driver of the instability. Results are shown for an effective mesh of 16,384 × 2,048 points, which took approximately 150 hours to run on 64 processors of Seaborg, 25 times faster than a non-AMR code.
Another new physical effect discovered by the AMRMHD code is current bunching and ejection during magnetic reconnection (Figure 4). Magnetic reconnection refers to the breaking and reconnecting of oppositely directed magnetic field lines in a plasma. In the process, magnetic field energy is converted to plasma kinetic and thermal energy.
The CEMM project has been collaborating with other SciDAC software centers in addition to APDEC. For example, in a collaboration that predated SciDAC, the group developing the M3D code was using PETSc, a portable toolkit of sparse solvers distributed as part of the ACTS Collection of DOE-developed software tools. Also in the ACTS Collection is Hypre, a library of preconditioners that can be used in conjunction with PETSc. Under SciDAC, the Terascale Optimal PDE Solvers (TOPS) Center worked with CEMM to add Hypre underneath the same code interface that M3D was already using to call the PETSc solvers. The combined PETSc-Hypre solver library allows M3D to solve its linear systems two to three times faster than before.
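The design point here is that the application code calls one solver interface while the preconditioner behind it can be swapped without touching the call site. The sketch below shows that pattern with a toy preconditioned conjugate-gradient solver in plain Python; it is an illustration of the interface idea only, not the actual PETSc or Hypre API, and the matrix and preconditioner are invented examples.

```python
# Sketch of the design behind the PETSc/Hypre integration: the application
# calls one solver interface, and the preconditioner "underneath" can be
# swapped without changing application code. Toy preconditioned
# conjugate-gradient (PCG) solver, not the real PETSc API.

def solve_cg(A, b, precond=lambda r: r[:], tol=1e-10, max_iter=200):
    """Solve A x = b (A symmetric positive definite) by preconditioned CG.
    `precond` applies an approximate inverse of A to a residual vector;
    the default is no preconditioning (identity)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual b - A x, with initial x = 0
    z = precond(r)
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(pi * Api for pi, Api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = precond(r)
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

# Small SPD test system (1D Laplacian stencil); exact solution is [1, 1, 1].
A = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
b = [1.0, 0.0, 1.0]

# Same call site, two different preconditioners plugged in underneath.
jacobi = lambda r: [r[i] / A[i][i] for i in range(len(r))]
x_plain = solve_cg(A, b)
x_jac = solve_cg(A, b, precond=jacobi)
```

Swapping `jacobi` for a stronger preconditioner (the role Hypre's multigrid methods play for M3D) changes only the `precond` argument, which is why the TOPS work could slide Hypre in beneath the interface M3D already used.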
According to Jardin, fusion plasma models are continually being improved both by more complete descriptions of the physical processes, and by more efficient algorithms, such as those provided by PETSc and CHOMBO. Advances such as these have complemented increases in computer hardware speeds to provide a capability today that is vastly improved over what was possible 30 years ago (Figure 5). This rate of increase of effective capability is essential to meet the anticipated modeling demands of fusion energy research, Jardin says.
“Presently, we can apply our most complete computational models to realistically simulate both nonlinear macroscopic stability and microscopic turbulent transport in the smaller fusion experiments that exist today, at least for short times,” Jardin says. “Anticipated increases in both hardware and algorithms during the next five to ten years will enable application of even more advanced models to the largest present-day experiments and to the proposed burning plasma experiments such as ITER [the International Thermonuclear Experimental Reactor].”
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.