Visualizing the unseen forces of turbulence
Supercomputers are giving researchers a new look at the problem
September 16, 2008
Turbulence’s invisible eddies and vortices can dramatically alter the flight of everything from golf balls to hypersonic jets. Grasping this vast power could help researchers produce better weather forecasts and design more efficient cars, quieter helicopters and even faster ships that “float” through the high seas on a cushion of air.
To really understand the chaotic force governing the flow of gases and liquids, though, we have to see how it works, and until a few years ago, researchers lacked both the computing firepower and the tightly focused programs to produce a clear picture of turbulent motion.
No longer. Smarter computer codes and unprecedented supercomputing speed have enabled researchers to accomplish once-unheard-of feats, like simulating the entire sequence of jet fuel combustion in a Pratt & Whitney PW6000 engine. By coupling ever-faster computers with a powerful suite of new processors and programs, scientists are beginning to pry apart one of the most intractable of all physics problems.
Water flowing from a half-open tap may have a fairly orderly, or laminar, motion. “But laminar flow is an exception rather than the rule,” said Parviz Moin, director of the Center for Turbulence Research at Stanford University. As the more dominant counterpart, turbulence underlies motion as varied as air buffeting an airplane wing and blood racing through the left ventricle of the heart.
To make sense of the chaos, researchers rely on a set of mathematical formulas known as the Navier-Stokes equations, derived from Newton’s laws of motion. The equations can reveal the velocity, pressure and air density associated with turbulent flow, allowing researchers to calculate critical parameters like lift, drag and twisting forces. But the enormous range of scales in both the speed and size of turbulent motion — consider a tiny eddy of fuel versus a swirling hurricane — has ruled out any pen-and-paper solutions.
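For reference, in the common case of an incompressible fluid the Navier-Stokes system couples a momentum equation to a mass-conservation constraint (here u is the velocity field, p pressure, ρ density, ν kinematic viscosity and f any body force; this standard form is supplied for context, not taken from the article):

```latex
\underbrace{\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\,\mathbf{u}}_{\text{acceleration}}
  = \underbrace{-\frac{1}{\rho}\nabla p}_{\text{pressure}}
  + \underbrace{\nu\,\nabla^{2}\mathbf{u}}_{\text{viscosity}}
  + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0
```

The nonlinear (u·∇)u term is what lets energy cascade from large eddies down to tiny ones, and it is the reason no general pen-and-paper solution exists.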
Even computers have sagged under the strain of correctly modeling the airflow over, say, an entire 747. “To be able to resolve all of these scales, both large and small, you need an enormous amount of computational horsepower,” Moin said. Fortunately, that power has increased by several orders of magnitude over the past decade, bringing supercomputers into the petaflop range of calculations for the first time — or 1 quadrillion mathematical computations per second.
Computers have their limits, of course. Ten years ago, Moin estimated that it would take thousands of years to compute the airflow over an entire commercial plane for just one second of flight time. Despite a decade’s worth of increasing power, he said, the solution would still tie up a top-notch computer for years or even decades. “A pure solution of the Navier-Stokes? It’s an impossible task and will remain so, even through the next few generations of supercomputers,” he said.
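Moin’s pessimism can be made concrete with a rough back-of-envelope sketch. Classical Kolmogorov scaling estimates that a direct simulation resolving every eddy needs on the order of Re^(9/4) grid points and Re^(3/4) time steps. The Reynolds number and per-point cost below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope: why brute-force direct numerical simulation (DNS)
# of a full airliner is out of reach. Kolmogorov scaling puts the grid
# at ~Re^(9/4) points and the run at ~Re^(3/4) time steps.
Re = 1e8                    # assumed Reynolds number for a large airliner
grid_points = Re ** 2.25    # ~Re^(9/4) points needed to resolve all eddies
time_steps = Re ** 0.75     # ~Re^(3/4) steps for one large-eddy turnover
flops_per_point_step = 1e3  # assumed arithmetic cost per point per step

total_flops = grid_points * time_steps * flops_per_point_step
petaflop_machine = 1e15     # 1 quadrillion operations/second, the 2008 record
years = total_flops / petaflop_machine / (3600 * 24 * 365)
print(f"grid points needed: {grid_points:.1e}")
print(f"runtime on a petaflop machine: {years:.0f} years")
```

Even with generous rounding, the answer lands in the tens of thousands of years — consistent with Moin’s claim that a pure Navier-Stokes solution will stay out of reach for generations of machines.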
Combining the increased speed with smarter programs, however, has allowed researchers to solve other tasks once considered hopeless. Ten years ago, simulating the turbulence-rich process of fuel combustion was limited to rather academic exercises. Now, Moin said, “we can simulate the entire combustion of an operating jet engine, the liquid fuel breaking up into droplets and evaporating and burning, all of the chemical reactions modeled, and the geometry of the combustion engine.” His team’s simulation of a Pratt & Whitney PW6000 engine, he said, has agreed well with physical tests, including the predicted temperature at the exit and the pollutants formed.
Visualizing an invisible force
“We can now ask, ‘What if?’ type questions: ‘What if we tried to manage the flow or turbulence in this way or that way?’” Moin said. The new line of questioning has led him and his colleagues to simulate how the turbulent flow of water around a ship’s hull might be disrupted with well-placed injections of air. Experiments so far suggest a thin barrier of air between the hull and water could increase a ship’s speed by as much as 40 percent by reducing friction. Among the associated questions the team is hoping to answer: Under what conditions would that air stay put?
Another ‘What if?’ question could help reduce the noise associated with helicopter blades. The racket, Moin said, can be explained by tip vortices coming off one blade and hitting another. “The question had been how to modify the shape of the blade so we can reduce this noise. The military is now very interested in doing that,” he said, citing the obvious utility of a stealthier helicopter.
Useful solutions may rely to a large degree on how well researchers can visualize the unseen forces contributing to turbulence. “Visualizing typically works very well because we are innately visual creatures,” said Wes Bethel, Group Leader of the Visualization Lab at Lawrence Berkeley National Laboratory in Berkeley, Calif. “We can spot trends or features that would be hard to detect mathematically.”
The trick is to transform the hyper-complex phenomenon of turbulence into an accurate visual representation. But the supercomputers needed to produce those informative and often colorful displays, Bethel said, also have complicated the task by generating “gobs” of data that can create bottlenecks in researchers’ analytical programs. While the amount of information is rapidly escalating, he said, “our cognitive capacity is fixed – we can only jam so much stuff into our heads.”
Bethel calls this problem “the meat grinder syndrome.” By scaling up to a bigger “meat grinder” of a visualization program, a researcher can significantly increase the throughput of information. “But it doesn’t guarantee that you’re going to increase scientific knowledge and discovery,” he said. “You’re just throwing more meat at the user to chew through.”
How much meat? Supercomputing hubs have been roughly doubling their number of processing cores every few years, and the U.S. Department of Energy’s national laboratories now boast four of the top five supercomputers in the world. Earlier this year, IBM’s Roadrunner computer at Los Alamos National Laboratory topped the petaflop computing benchmark for the first time ever.
Bringing bytes down to size
For a dataset running well into the trillions of bytes, it would be impractical to do a brute-force analysis, Bethel said.
Fortunately, researchers are turning to higher-capacity multicore platforms and graphics processing units, or GPUs, to lighten the load. Some of the muscular new graphics-processing chips have come courtesy of the video game industry, such as the GPUs in Sony’s PlayStation 3 supplied by Santa Clara, Calif.-based NVIDIA.
Engineers and scientists at NASA’s Ames Research Center at Moffett Field, Calif., for example, have developed a wall of 128 screens, called hyperwall-2, that uses GPUs and processor cores to render computer graphics with a resolution of 250 million pixels. Surpassing that level of detail would require the collective power of 600 video game consoles. Scientists also are implementing smarter programs like a new visual data exploration and mining application that “does in a few seconds what formerly took days or weeks of visualization time,” Bethel said. Other algorithms can track topological features of liquids in four dimensions, including time, thereby attaching solid numbers to the central question of how varying turbulence influences the degree of mixing.
P.K. Yeung, a professor of aerospace engineering at Georgia Institute of Technology in Atlanta, said his team’s simulations of fluid turbulence now incorporate eight times as many grid points as they did three years ago – from 8 billion to 64 billion. The increase, he said, has permitted him to examine much finer details of turbulent flow and how the mixing of fluids takes place in very small dimensions.
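The eightfold jump in grid points follows directly from doubling the resolution of a cube-shaped grid: twice as many points along each of three dimensions means 2³ = 8 times as many points overall. The specific grid sizes below are assumptions chosen to match the article’s “8 billion” and “64 billion” figures:

```python
# Doubling the resolution of a 3-D simulation grid in every dimension
# multiplies the total number of grid points by 2**3 = 8.
old_grid = 2048 ** 3   # ~8.6 billion points (assumed earlier grid)
new_grid = 4096 ** 3   # ~68.7 billion points (assumed current grid)
print(f"old: {old_grid:,}")
print(f"new: {new_grid:,}")
print(f"factor: {new_grid // old_grid}")
```

The flip side is that each doubling also shrinks the time step the solver can take, so the real cost grows faster than the point count alone suggests.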
The team has had to rewrite computer codes to subdivide computational tasks into increasingly smaller pieces so the communication among fast-multiplying parallel processors can be properly managed. But that added flexibility has allowed Yeung to deal with more complex types of turbulent flow, like the turbulence encountered by fluids with significant variations in density. The work could have broad implications for natural phenomena like hurricanes, where air masses of different densities interact within a vertical column.
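The core idea behind that code rewrite — carving one huge grid into per-processor pieces so each task works locally and only exchanges thin boundary layers with its neighbors — can be sketched in a few lines. This is a generic illustration of domain decomposition, not Yeung’s actual production code:

```python
# Minimal sketch of 1-D domain decomposition: split n_points grid
# indices into one contiguous slab per processor, as evenly as possible.
# In a real solver each rank would also exchange "halo" boundary values
# with its neighbors every time step.
def decompose(n_points, n_procs):
    """Assign each processor rank a contiguous slab of grid indices."""
    base, extra = divmod(n_points, n_procs)
    slabs, start = [], 0
    for rank in range(n_procs):
        size = base + (1 if rank < extra else 0)  # spread the remainder
        slabs.append(range(start, start + size))
        start += size
    return slabs

slabs = decompose(n_points=1000, n_procs=8)
print([len(s) for s in slabs])  # each of the 8 ranks holds 125 points
```

As processor counts multiply, the slabs shrink and the communication at their edges comes to dominate — which is exactly the management problem the paragraph above describes.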
“And also in the ocean, we know that the water at the bottom is colder and denser and so there is interest in how the mixing takes place in a vertical dimension,” he said. “If the surface of the ocean gets warmed by 0.1 degrees, what is going to happen to the marine animals living closer to the bottom?”
Beyond climate and marine studies, aerospace applications stand to gain enormously from a more detailed look at turbulence.
A few years ago, NASA engineers lacked the computational power to model the potential impact of tile damage on a space shuttle re-entering the Earth’s atmosphere until after the fact, once the shuttle had landed. Now those simulations can be run mid-flight, with realistic scenarios of the turbulent forces acting on the shuttle’s tiles and their likely effects.
Similarly, software engineers at NASA’s Ames Research Center have developed a program called Data-Parallel Line Relaxation (DPLR) to scrutinize the volatile environments that spacecraft and their occupants might encounter during high-speed entries into the atmospheres of Earth and other planets.
The highly accurate computer simulations do for NASA’s spaceflight program what physical test facilities cannot, though wind tunnels may not go the way of the dinosaur just yet. Still, both NASA and Boeing have been able to sharply reduce the number of expensive wind tunnel tests needed to design critical elements such as thermal materials and airplane wings.
Despite the enormous potential for computer modeling, however, researchers agree that sometimes there’s no substitute for a reality check in the form of a well-timed blast of turbulent air.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.