# Special Feature: Five Questions for Berkeley Lab's CRD Director

## David Brown Talks About Applying the Magic of Mathematics to Solve Problems Using Supercomputers

September 13, 2013 Tags: CRD

David Brown has been the director of the Computational Research Division at Lawrence Berkeley National Laboratory (Berkeley Lab) since August 2011. His 31-year career with the U.S. Department of Energy (DOE) national laboratories includes 14 years at Los Alamos National Laboratory and 13 years at Lawrence Livermore National Laboratory. An applied mathematician by training, his research expertise and interests lie in the development and analysis of algorithms for the solution of partial differential equations (PDEs). In particular, his research has focused on adaptive composite overlapping grid techniques for solving PDEs in complex moving geometries and on the analysis of difference approximations for PDEs. At LANL and LLNL, he led the highly successful Overture project, which in 2001 was named one of the 100 “most important discoveries in the past 25 years” by the DOE Office of Science.

In 2007, Brown convened an independent panel from the applied math research community to investigate how past, present and future math research supported by DOE’s Office of Advanced Scientific Computing Research could be applied to help tackle challenges being addressed by both the DOE Office of Science and the offices of Nuclear Energy, Fossil Energy, Environmental Management, Legacy Management, Energy Efficiency and Renewable Energy, Electricity Delivery and Energy Reliability, and Civilian Radioactive Waste Management, as well as the National Nuclear Security Administration. (Read the panel’s report.)

Brown earned his Ph.D. in Applied Mathematics from the California Institute of Technology in 1982. He also holds a B.S. in Physics and an M.S. in Geophysics from Stanford University. In honor of DOE's supercomputing month, we asked Brown five questions about how applied math is making supercomputers even more powerful tools for scientific discovery.

**Q: Can you describe your career path? Is it something you could have plotted out when you were a student?**

**David Brown:** Not really. Like nearly every other student going through grad school, I figured I would end up being a professor, in my case teaching applied mathematics. Then I got a post-doc position at Los Alamos National Laboratory. I figured I would spend a couple of years there, then go teach. But I never left. At that time, the best places one could get access to supercomputers were the DOE and NASA labs. There were so many interesting problems to work on, and the laboratory environment encouraged us to build teams for collaborative science. Later, when I moved to Lawrence Livermore National Lab, I was able to apply my knowledge of math and science to the development and oversight of new research opportunities for scientists and mathematicians at that lab and throughout the DOE.

**Q: So how does math apply to supercomputers?**

**DB:** The scientific performance of big applications on supercomputers is as much a result of better mathematical models and algorithms as it is of increases in computer performance. In fact, the performance gains that many scientific applications have seen from better models and algorithms have often exceeded the gains due to Moore’s Law. And Moore’s Law, which predicts a doubling of performance every 18 months, offers a pretty impressive increase on its own. These improvements in performance help scientists make much more efficient use of supercomputers and study problems in greater detail.
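To give a sense of how quickly that 18-month doubling compounds, here is a back-of-the-envelope calculation (a simple illustration of the growth rate, not a measurement of any particular machine):

```python
# Moore's Law as stated above: performance doubles every 18 months,
# so after t months the cumulative speedup factor is 2 ** (t / 18).

def moores_law_factor(months):
    """Cumulative speedup after `months`, doubling every 18 months."""
    return 2 ** (months / 18)

# One doubling period gives exactly 2x.
print(moores_law_factor(18))   # 2.0

# Over a decade (120 months), hardware alone compounds to roughly 100x,
# which is the baseline that algorithmic improvements have often beaten.
print(round(moores_law_factor(120)))
```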

**Q: You’re responsible for an extensive research program in applied mathematics at Berkeley Lab. Why does the DOE invest in math research?**

**DB:** Mathematics is the language of science, and in particular, the language that allows science to be put on computers. DOE invests in mathematics research to develop new and better mathematical theories, models and algorithms that allow us to model and analyze physical and engineered systems that are important to DOE’s mission. Often math is used to make a very difficult problem tractable on computers. As an example, 30 years ago one of our mathematicians, James Sethian, helped discover how to use asymptotic methods to create equations that could simulate combustion much more effectively and efficiently than had been done before. This discovery has since become the basis for modern supercomputer codes that are enabling discoveries in fields as diverse as combustion, astrophysics and atmospheric flow.

**Q: Can you give a few examples of how that research pays off?**

**DB:** When I began my career, one of the holy grails of math was being able to simultaneously track a large number of interacting surfaces – tracking just one surface was hard enough in those days. Earlier this year, Sethian and Robert Saye published a paper that successfully solved this problem, focusing on bubbles. Saye and Sethian discovered sets of equations that could describe clusters of hundreds of bubbles. One set of equations described the gravitational draining of liquid from the bubble walls, which thin out until they rupture. Another set of equations dealt with the flow of liquid inside the junctions between the membranes. A third set handled the wobbly rearrangement of bubbles after one pops. Using a fourth set of equations, the mathematicians solved the physics of a sunset reflected in the bubbles, taking account of thin film interference within the bubble membranes, which can create rainbow hues like an oil slick on wet pavement. Then they used the Hopper supercomputer at the National Energy Research Scientific Computing Center to solve the full set of equations of motion. This work has applications ranging from developing foams for firefighting to understanding the mechanics of cell growth. [Read more.]

Another mathematical method, known as adaptive mesh refinement or AMR, allows scientists to automatically focus the power of supercomputers on the most interesting parts of the problem, giving them a much more detailed understanding of what’s happening. Before AMR, computers typically broke a problem down into a grid of uniformly sized squares. AMR allows them to create squares and rectangles of varying sizes to focus on key areas, such as a moving flame front in a combustion model. One AMR developer described the method as a “numerical microscope.” This method is being used to study problems such as melting ice sheets in Antarctica, supernovas, combustion and even national security applications. AMR was largely developed by mathematicians at DOE national laboratories.
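The core idea behind AMR, refining the grid only where an error indicator flags something interesting, can be sketched in a few lines. The one-dimensional grid, the jump-based error indicator, and the refinement depth below are illustrative assumptions for this sketch, not the production algorithms used in DOE codes:

```python
import math

# Minimal 1-D sketch of adaptive mesh refinement (AMR).
# Cells whose error indicator exceeds a tolerance are split in two,
# so resolution concentrates near sharp features such as a flame front.

def refine(cells, f, tol, max_levels=5):
    """cells: list of (left, right) intervals covering the domain.
    f: the field being resolved. Split any cell where f jumps by more
    than `tol` across the cell, up to `max_levels` rounds of refinement."""
    for _ in range(max_levels):
        new_cells, refined = [], False
        for (a, b) in cells:
            if abs(f(b) - f(a)) > tol:            # crude error indicator
                mid = 0.5 * (a + b)
                new_cells += [(a, mid), (mid, b)]  # split the flagged cell
                refined = True
            else:
                new_cells.append((a, b))           # keep the coarse cell
        cells = new_cells
        if not refined:
            break
    return cells

# A steep "front" at x = 0.5 on the unit interval.
front = lambda x: math.tanh(50 * (x - 0.5))
coarse = [(i / 10, (i + 1) / 10) for i in range(10)]
fine = refine(coarse, front, tol=0.1)
# Cells near x = 0.5 end up much smaller than those far away —
# the "numerical microscope" zooms in only where the action is.
```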

**Q: What do you find most interesting about math?**

**DB:** I’ve always been attracted by the beauty in the formalism of mathematics, how you can take a physical reality and turn it into mathematical equations. You can then use those equations to predict things you wouldn’t have observed or anticipated – they can tell you what to go looking for next when you’re doing experiments.

It’s kind of like magic.

##### About Berkeley Lab Computing Sciences

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe. ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 5,500 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation.