
NERSC Expands the Frontiers of Science

August 5, 2009

Contact: Linda Vu, CSnews@lbl.gov


"NERSC's Franklin system made us one of the leading teams in the world for researching complicated material systems, and we are determined to maintain our competitive edge in materials research by utilizing the center's high power supercomputing resources,"

-Wai-Yim Ching, physics professor at the University of Missouri in Kansas City.


The Department of Energy's (DOE) National Energy Research Scientific Computing Center (NERSC) recently accepted a series of upgrades to its Cray XT4 supercomputer, named Franklin, providing the facility's 3,000 users with twice as many processor cores and an expanded file system for scientific research. From nuclear physics to materials science, the additional resources have already facilitated breakthroughs in a variety of disciplines.

By next year, science users will have access to even more resources when NERSC's next flagship supercomputer comes online. The new system will deliver a peak performance of more than one petaflop per second, equivalent to more than one quadrillion calculations per second.

In the following stories, NERSC users discuss how they have benefitted from the Franklin upgrades and the breakthroughs they could achieve with additional resources.

Enhancing Ceramics

“NERSC's Franklin system made us one of the leading teams in the world for researching complicated material systems, and we are determined to maintain our competitive edge in materials research by utilizing the center’s high power supercomputing resources,” says Wai-Yim Ching, a professor of physics at the University of Missouri in Kansas City.

This past year, Ching led a project to identify the mechanical weaknesses of intergranular glassy films (IGF) by performing theoretical experiments with the expanded Franklin system. IGFs are ubiquitous microstructures found in all kinds of polycrystalline ceramics and may even exist in metals. Many of the physical properties of these materials are controlled by the properties of the IGFs. Ceramic materials are used in a wide variety of settings from structural materials to electronics to biomaterials. This fundamental research provides valuable insights that allow engineers to enhance the IGFs' useful properties and thereby enhance the materials that we rely on every day.

In another project, Ching's research group used the upgraded Franklin system to develop a novel technique called spectral imaging (SI) as a complement to experimental electron microscopy research. SI can reveal subtleties about the interactions between atoms within complex microstructures. Electron microscopy experiments can create high-resolution atomic scale images of materials (some can magnify samples more than 2 million times). Although this level of magnification can illuminate a lot of details about the atomic structure of a material, there are practical difficulties when studying important non-crystalline systems or buried defects. The SI technique complements experimental electron microscopy research by utilizing supercomputers to identify differences in atomic interactions due to the presence of defects and microstructures in materials.

“To construct the multi-dimensional datasets that we will use for SI analysis, we must perform atom-by-atom ELNES [energy-loss near edge structure] spectral calculations. The computational task for a system containing thousands of atoms can be spread across that many processors. Therefore, because the upgraded Franklin system has more than 38,000 processors, we can study many structures more efficiently, and other larger and more complex structures for the first time,” says Ching.
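
Because each atom's spectrum can be computed independently, the workload divides naturally across processors. The following sketch illustrates that general pattern with mpi4py; the compute_elnes_spectrum function, input file, and atom count are hypothetical placeholders, not Ching's actual code.

    # Minimal sketch: distribute per-atom ELNES spectral calculations across MPI ranks.
    # compute_elnes_spectrum() is a hypothetical stand-in for the real calculation.
    from mpi4py import MPI

    def compute_elnes_spectrum(atom_index, structure):
        """Placeholder for one atom's ELNES calculation."""
        return {"atom": atom_index, "spectrum": []}

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    structure = "igf_model.xyz"   # hypothetical input structure
    n_atoms = 10000               # a system containing thousands of atoms

    # Each rank takes a strided subset of atoms, spreading the work evenly.
    my_results = [compute_elnes_spectrum(i, structure)
                  for i in range(rank, n_atoms, size)]

    # Gather all per-atom spectra on rank 0 to assemble the spectral-imaging dataset.
    all_results = comm.gather(my_results, root=0)
    if rank == 0:
        spectra = [r for chunk in all_results for r in chunk]
        print(f"Collected {len(spectra)} atomic spectra")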

Speeding Up Data Storage

SPEEDING UP STORAGE WITH LASERS: A team led by Guo-Ping Zhang used NERSC's Franklin system to acquire the first glimpse of how electrons in magnetic materials evolve at timescales of a quadrillionth of a second. Insights from this research will allow future computers to save data to disk 1,000 times faster than existing technology.

A project led by Guo-Ping Zhang, a professor of physics at Indiana State University, used the expanded Franklin system to acquire the first glimpse of how electrons in magnetic materials evolve at timescales of a quadrillionth of a second. Insights from this research will allow future computers to save data to disk 1,000 times faster than existing technology. Today's computers need about 40 to 50 milliseconds to write data onto a disk.

“Once we understand what electrons are doing, we can manipulate or control them externally. Once we are able to manipulate them, we can build a device to use them to store data on magnetic disks on a much shorter time scale,” says Zhang.

He notes that access to Franklin has significantly increased his scientific output. Running this type of simulation on a local computer cluster with approximately 50 processors would take one to three years. With access to 2,000 to 4,000 processor cores on Franklin, Zhang adds, his team can get results from the same simulation in about two weeks.
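
A back-of-the-envelope estimate, assuming roughly linear scaling (the code's actual scaling behavior is not described here), shows why that jump matters.

    # Rough scaling estimate using only the figures quoted above, assuming
    # near-linear speedup; these are not measured benchmark numbers.
    local_cores = 50
    local_runtime_days = 2 * 365                  # roughly two years on a local cluster
    core_days = local_cores * local_runtime_days  # total work: ~36,500 core-days

    for franklin_cores in (2000, 4000):
        est_days = core_days / franklin_cores
        print(f"{franklin_cores} cores: ~{est_days:.0f} days")
    # 2,000 cores: ~18 days; 4,000 cores: ~9 days -- on the order of the
    # two-week turnaround Zhang describes.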

“Our Franklin runs allow us to compete with our competitors. Before our Franklin runs we were publishing one paper every two to four years in this area of research,” says Zhang. “Now we publish about three papers a year with our Franklin results.”

Zhang notes that limited computing resources are the biggest roadblock to his research. Functional materials, like magnets, have complicated structures. Computationally, this means there are more differential equations to solve, which requires more resources.

If more computing resources were to become available, Zhang says he would like to run simulations where two laser pulses interact with a magnetic material. The first pulse would influence the electrons, and the second would detect the effects of the first laser pulse. Once scientists understand these effects, they can design the next generation of input-output technology.

Better Fuel Cell Design

As a DOE Nanoscale Science Research Center, the Center for Nanophase Materials Sciences (CNMS) at Oak Ridge National Laboratory helps researchers across the country synthesize, process, fabricate and analyze materials at the nanoscale. Every year, the CNMS applies to NERSC for supercomputing time, then metes this time out to its users. About a dozen projects have used CNMS hours on Franklin since the upgrade.

“In the field of material sciences, throughput is extremely important because we usually have many systems to study, and at the end of a run, a scientist still needs to analyze the data. If the computation takes months to produce a result before the researcher can analyze the data, that is a potential waste of resources,” says Paul Kent of CNMS.

“The upgraded Franklin system allowed our researchers to simulate larger, more complicated systems and get throughput fast enough to make good scientific progress.”

According to Kent, the additional resources allowed scientists to simulate more complex structures that would have taken too long to compute or been impossible to compute previously. In one case, Matthew Neurock of the University of Virginia used the expanded system to simulate how platinum nanoparticles interact in water. Platinum particles are used as catalysts in fuel cells. A better understanding of their properties in the aqueous environment could lead to more efficient fuel cell designs.

“This type of simulation is extremely complex; you have to look at systems with thousands of atoms and model the different properties that each structure has,” says Kent. “This type of research is very processor intensive, and it wouldn’t have been possible without the additional resources and improved throughput that resulted from the upgrades.”

Understanding Global Climate

“We are very pleased with the improvements on Franklin. Since these upgrades were completed, our scientific output has increased by 25 percent — meaning we can make more historical weather maps in significantly less time,” says Gil Compo, a climate researcher at the University of Colorado at Boulder's CIRES Climate Diagnostics Center and the NOAA Earth System Research Laboratory. “NERSC's excellent computing facilities and phenomenally helpful consulting staff were extremely important to our research.”

A NERSC user since 2007, Compo is leading a project, called the 20th Century Reanalysis Project, to reconstruct global weather conditions in six-hour intervals from 1871 to the present. He notes that these weather maps will help researchers forecast weather trends of the next century by assessing how well the computational tools used in projections can recreate the conditions of the past.

With awards from DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, Compo's team used NERSC systems to recreate the deadly Knickerbocker Storm of 1922, which killed 98 people and injured 133, as well as the 1930s Dust Bowl, which left half a million people homeless as dust storms rolled over the drought-barren Great Plains of the United States.

Ocean Circulation

OCEAN EDDIES: This image comes from a computer simulation modeling eddies in the ocean. An interesting feature is the abundance of eddies away from the equator, which is shown at the center of the image at y=0. The image comes from a research collaboration led by Paola Cessi of the Scripps Institution of Oceanography, which performed more than 15,000 years' worth of deep ocean circulation simulations with 1.6 million processor core hours on the upgraded Franklin system.

Another project, led by Paola Cessi of the Scripps Institution of Oceanography, performed more than 15,000 years' worth of deep ocean circulation simulations on Franklin, using approximately 1.6 million processor core hours. Insights from these experiments will give researchers a better understanding of how oceans circulate and how changes in the atmosphere affect these processes.
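
A quick calculation based only on the figures quoted above puts those numbers in perspective; actual costs would vary with model resolution and configuration.

    # Average computational cost of the ocean runs, from the quoted totals.
    core_hours = 1.6e6        # processor core hours used on Franklin
    simulated_years = 15000   # years of deep ocean circulation simulated

    cost_per_year = core_hours / simulated_years
    print(f"~{cost_per_year:.0f} core-hours per simulated year")
    # ~107 core-hours per simulated year; on 1,000 cores that is roughly
    # six to seven minutes of wall-clock time per simulated year.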

Energy from the Sun does not fall equally on Earth: the most sunlight is absorbed near the equator, and the oceans transport that heat across the globe. Cessi's project used Franklin to simulate how mesoscale oceanic flows, which are driven by surface winds and differences in solar heating, bring heat from the deep ocean to the surface. Since the oceans are the biggest repositories of carbon dioxide on the planet, this research could also provide valuable insights into how oceans cycle greenhouse gases into the atmosphere and across the globe.

“As a result of the Franklin experiments, we have been able to demonstrate that the Southern Ocean exerts remarkable control over the deep stratification and overturning circulation throughout the ocean,” says Christopher Wolfe, a researcher at the Scripps Institution of Oceanography and a member of Cessi's team.

According to Wolfe, most abrupt climate change studies to date have focused primarily on how density changes in the North Atlantic Ocean affect the strength of the Meridional Overturning Circulation (MOC), which controls the overturning circulation throughout the world's oceans.

“The results of our simulations indicate that changes in the Southern Ocean forcing could also have a large impact on MOC and, therefore, on global climate,” says Wolfe.

Nuclear Physics

“Our research benefitted from the extra compute time available to general scientific research as a result of the quad-core upgrade on Franklin. Though our code runs on pretty much any supercomputer, the superior networks of the Cray XT machines make it a desirable platform for us,” says William Detmold, assistant professor of physics at the College of William and Mary.

With the additional compute time, Detmold and his colleagues achieved the first quantum chromodynamics (QCD) calculations of the three-body force between hadrons. A better understanding of these interactions could ultimately improve models of nuclei, as well as provide valuable insights into the life and death of stars.

QCD is the theory that describes the complex interactions, or the strong force, between quarks—the constituents of protons, neutrons and certain other subatomic particles.

“For a long time, nuclear physicists have tried to reproduce the spectra of nuclei using models that only involve pairwise interactions between protons and neutrons. But in recent times, many examples have been found where this approach fails to reproduce experimental data, and a direct three nucleon interaction is needed to better describe experiment,” says Detmold.

Although no laboratory experiment or computer simulation has ever been able to directly constrain the three-body interaction between nucleons, Detmold notes that the recent Franklin QCD calculations may pave the way for this type of research by confirming current theories about how the strong force works.

Instead of calculating the three-body interaction between nucleons directly, Detmold and his colleagues used their understanding of the strong interaction to calculate three-body interactions in systems that are less computationally demanding. One calculation looked at the three-body interaction among pions, the simplest composite particles formed from quarks in nature.

Understanding Oxygen

James Vary, a professor of physics at Iowa State University, led a research team that used Franklin to calculate the energy spectrum of the atomic nucleus of oxygen-16, which contains eight protons and eight neutrons. This is the most common isotope of oxygen and it makes up more than 99 percent of the oxygen that humans breathe. He notes that this is currently the most complex nucleus being studied with his group's methods.

“The quad-core upgrade had a significant impact on our ability to push the frontiers of nuclear science. The expanded machine made 40,000 compute cores available for our research, which allowed us to calculate the energy spectrum of a nucleus that would have been previously impossible,” says Vary.

According to Vary, physicists currently do not fully understand the fundamental structure of the atomic nucleus, especially the basic forces between protons and neutrons. A better understanding of these principles could help improve the design and safety of nuclear reactors, as well as help scientists better understand the life cycles of stars.

The current method for understanding the fundamental structure of atomic nuclei involves a lot of back-and-forth between computer simulations and laboratory experiments. Researchers use their theories about how the fundamental structures and forces work to create computer models that calculate the energy spectrum of the nucleus of a particular atom. The results of these calculations are then compared with laboratory experiments to identify any discrepancies. The inconsistencies between the computer calculations and lab experiments give researchers insight into how to refine their theories. The heavier the atom, that is, the more protons and neutrons it has, the more complex its fundamental structure and interactions. The ultimate goal of this research is to precisely predict how protons and neutrons interact in extremely complex systems.
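
In schematic terms, the comparison step of that cycle amounts to lining up calculated and measured energy levels and quantifying the discrepancy. The sketch below is purely illustrative; the energy values are placeholders, not Vary's data or method.

    # Illustrative compare-and-refine step: calculated energy levels are matched
    # against measured ones, and the size of the residuals guides refinement of
    # the assumed inter-nucleon forces. Values below are placeholders.
    import math

    calculated_levels = [0.0, 6.2, 6.4, 7.0]   # MeV, hypothetical model output
    measured_levels   = [0.0, 6.0, 6.1, 6.9]   # MeV, hypothetical experimental data

    residuals = [c - m for c, m in zip(calculated_levels, measured_levels)]
    rms = math.sqrt(sum(r * r for r in residuals) / len(residuals))

    print(f"RMS discrepancy: {rms:.3f} MeV")
    # A large discrepancy signals that the model of the underlying forces needs
    # refinement; the cycle of calculation and comparison then repeats.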

“The calculations that we did on Franklin gave us very valuable information about how to refine our theories and methods, and tell us that we have a long way to go before we can begin to understand extremely complex systems like the nucleus of a sodium atom,” says Vary. His research on Franklin was conducted with an award from the Energy Research Computing Allocations Process (ERCAP).

Scientific Visualization

A SUPERNOVA'S VOLUME: This volume rendering of supernova simulation data was generated by running the VisIt application on 32,000 processors on Franklin, a Cray XT4 supercomputer at NERSC.

As computational scientists are confronted with increasingly massive datasets from supercomputing simulations and experiments, one of the biggest challenges is having the right tools to gain scientific insight from the data. A team of DOE researchers recently ran a series of experiments on some of the world's most powerful supercomputers and determined that VisIt, a leading scientific visualization application, is up to the challenge.

The team ran VisIt on 32,000 processor cores on the expanded Franklin system and tackled datasets with as many as 2 trillion zones, or grid points. The data was loaded in parallel, with the application performing two common visualization tasks — isosurfacing and volume rendering — and producing an image.
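
For readers unfamiliar with VisIt's scripting interface, the sketch below shows roughly what such a batch job looks like in VisIt's Python command-line interface; the file name and variable are placeholders, and launching the parallel compute engine is handled through site-specific host profiles rather than shown here.

    # Sketch of a batch VisIt script (run with: visit -cli -nowin -s render.py).
    # "supernova.silo" and the variable "density" are placeholders.
    OpenDatabase("supernova.silo")

    # Isosurfacing: extract contour surfaces of the density field.
    AddPlot("Contour", "density")
    DrawPlots()
    SaveWindow()          # writes an image of the isosurfaces

    DeleteAllPlots()

    # Volume rendering: render the same field as a translucent volume.
    AddPlot("Volume", "density")
    DrawPlots()
    SaveWindow()          # writes the volume-rendered image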

“These results are the largest-ever problem sizes and the largest degree of concurrency ever attempted within the DOE visualization research community. They show that visualization research and development efforts have produced technology that is today capable of ingesting and processing tomorrow's datasets,” says E. Wes Bethel of the Lawrence Berkeley National Laboratory (Berkeley Lab). He is a co-leader of the Visualization and Analytics Center for Enabling Technologies (VACET), which is part of DOE's SciDAC program.

“NERSC plays a valuable role in providing facilities for conducting computer and computational science research, which includes experiments like the one we did,” adds Bethel.


About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery. Researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab’s Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.