InTheLoop | 10.14.2013
NERSC User Martin Karplus Wins Nobel Prize in Chemistry
On Wednesday, the Nobel Prize in Chemistry was awarded to three scientists for pioneering methods in computational chemistry that have brought a deeper understanding of chemical structure and reactions in biochemical systems. These methods make it possible to calculate in detail how complex molecules work and even to predict the outcomes of complex chemical reactions. One of the laureates—Martin Karplus of Harvard University—has been using supercomputers at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) since 1998. Read more.
In Memoriam: Dave Stevens, a Steady Hand in Guiding Early Scientific Computing
David F. Stevens, whose lab career spanned 40 years during which he worked for every lab director except Ernest Lawrence, has died. He began his career as a mathematical programmer in Bldg. 46 and was serving as the lab’s liaison to DOE, reviewing directives and formulating responses, when he retired in December 1997. Hired in 1960, Stevens continued working at the lab as a guest scientist until 2000. Along the way, he had a stint as a computing consultant at CERN, worked with many of the lab’s supercomputers in the 1960s and ‘70s, and headed the computer security program. He also served on several national standards committees. Read more.
Article on Using Computers to Study Human Brain Cites Ushizima’s Visualization Work
UX Magazine, which focuses on “everything related to user experience,” recently published an article on brain research. “It’s inevitable a powerful problem-solving machine like the human brain would turn increasing attention to one of the greatest puzzles of all: itself,” the article began. “Ironically, it may be impossible for us to decipher many of the secrets of our own brains without significant help from the digital tools we’ve created. Advances in machine learning and neuroimaging, in combination with visualizations that leverage human cognitive and perceptual strengths, are paving the way toward a far better understanding of our brains.”
To illustrate ideas on brain visualization, writer Hunter Whitney cited work done by CRD’s Daniela Ushizima in collaboration with UC San Francisco and Oblong Industries. Ushizima’s research looks at using visualizations to enable more precise treatment of neurological disorders. Read the article.
HPCwire Reports on Bert de Jong’s Research on Controlling for Soft Errors
Bert de Jong, leader of CRD’s Scientific Computing Group, co-authored a paper, recently published in the Journal of Chemical Theory and Computation, on controlling for soft errors: incorrect behaviors that can undermine the validity of simulations. The paper, written while de Jong was still at Pacific Northwest National Laboratory, was also covered by HPCwire.
“As the number of cores per machine increases, incorrect behaviors, known as soft errors, begin to threaten the validity of simulations,” wrote Tiffany Trader. “When you consider that exascale machines will employ billion-way parallelism, the necessity to address this problem is clear.
“A team of scientists from PNNL performed experiments revealing the high risk of soft errors on large-scale computers. The research team found that without intervention, soft errors invalidate simulations in a large fraction of cases, but they also developed a technique that will correct 95 percent of them.”
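The HPCwire piece does not describe the team’s specific detection-and-correction scheme. As general background only, one widely used idea for catching soft errors in numerical kernels is algorithm-based fault tolerance (ABFT), in which checksums carried through a computation expose corrupted results. A minimal, hypothetical Python sketch of checksum-based detection for a matrix multiply (this illustrates the generic ABFT idea, not the PNNL method):

```python
import numpy as np

# ABFT sketch for C = A @ B: append a checksum row (the column sums of A)
# to A before multiplying. Afterward, the checksum row of the result must
# equal the column sums of C; a mismatch flags the column holding a
# corrupted entry.

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

A_aug = np.vstack([A, A.sum(axis=0)])   # A with checksum row appended
C_aug = A_aug @ B
C = C_aug[:-1].copy()                   # the actual product

C[2, 1] += 1e-3                         # inject a simulated bit flip

residual = np.abs(C.sum(axis=0) - C_aug[-1])
bad_cols = np.where(residual > 1e-8)[0]  # columns failing the checksum test
```

With both row and column checksums, a single corrupted entry can be localized to a row-column intersection and corrected from the checksum residual, which is the flavor of correction (as opposed to mere detection) that the quoted 95 percent figure suggests.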
This Week’s Computing Sciences Seminars
Metastability and Coarse-graining of Stochastic Systems
Monday, October 14, 2 – 3 p.m., Bldg. 50F, Room 1647
Jianfeng Lu, Duke University
Abstract: The study of rare events in physical, chemical and biological systems is important and challenging due to the huge span of time scales involved. Coarse-graining techniques, such as Markov state models, are employed to reduce the degrees of freedom of the system and hence enable simulation and understanding of the system on long time scales. In this talk, we will introduce a novel construction of a Markov state model based on milestoning. We will discuss the quality of the approximation when the original system is metastable. The analysis, based on transition path theory, identifies quantitative criteria that enable automatic identification of metastable sets.
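The milestoning-based construction is the subject of the talk itself; as background, a conventional Markov state model is estimated by discretizing a trajectory into states and counting transitions at a fixed lag time. A minimal Python sketch (function name and toy trajectory are illustrative, not from the talk):

```python
import numpy as np

def estimate_transition_matrix(traj, n_states, lag=1):
    """Estimate a Markov state model transition matrix by counting
    transitions at the given lag time in a discrete state trajectory,
    then row-normalizing the counts into probabilities."""
    counts = np.zeros((n_states, n_states))
    for i in range(len(traj) - lag):
        counts[traj[i], traj[i + lag]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0  # leave rows of unvisited states as zeros
    return counts / rows

# Toy trajectory hopping among three coarse states
traj = [0, 0, 1, 0, 0, 0, 2, 2, 2, 1, 2, 2, 0]
T = estimate_transition_matrix(traj, 3)
```

The metastability the abstract refers to shows up in such a matrix as strong diagonal dominance: the system stays in a coarse state for many lag times before hopping.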
An Asymptotic Parallel-In-time Method for Highly Oscillatory PDEs
Tuesday, October 15, 11 a.m. – 12 p.m., Bldg. 50B, Room 4205
Terry Haut, Center for Nonlinear Studies, Los Alamos National Laboratory
Abstract: We present a new time-stepping algorithm for nonlinear PDEs that exhibit scale separation in time. Our scheme combines asymptotic techniques (which are inexpensive but can have insufficient accuracy) with parallel-in-time methods (which, alone, can be inefficient for equations that exhibit rapid temporal oscillations). In particular, we use an asymptotic numerical method for computing, in serial, a solution with low accuracy, and a more expensive fine solver for iteratively refining the solutions in parallel. Examples on the shallow water equations demonstrate that orders of magnitude speedup and high accuracy are achievable.
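The asymptotic coarse solver is specific to the talk, but the parallel-in-time correction it plugs into follows the standard parareal pattern: a cheap coarse propagator swept in serial, an accurate fine propagator applied to each time window (in parallel in practice), and an iterative correction. A toy Python sketch on the scalar decay equation u' = -u (the solvers and step counts here are illustrative placeholders, not those of the talk):

```python
import numpy as np

# Parareal sketch for u' = lam * u on [0, T], exact solution exp(lam * t).
# G: one backward-Euler step per window (cheap, low accuracy).
# F: many explicit-Euler substeps per window (expensive, accurate).
# In a production code the F evaluations across windows run in parallel.

lam, T, N = -1.0, 2.0, 8        # decay rate, final time, time windows
dt = T / N

def G(u, dt):                   # coarse propagator: one implicit Euler step
    return u / (1.0 - lam * dt)

def F(u, dt, m=100):            # fine propagator: m explicit Euler substeps
    h = dt / m
    for _ in range(m):
        u = u + h * lam * u
    return u

U = np.zeros(N + 1); U[0] = 1.0
for n in range(N):              # serial coarse sweep gives the initial guess
    U[n + 1] = G(U[n], dt)

for k in range(5):              # parareal correction iterations
    Fk = [F(U[n], dt) for n in range(N)]   # the parallelizable fine solves
    Gk = [G(U[n], dt) for n in range(N)]
    for n in range(N):          # serial correction sweep
        U[n + 1] = G(U[n], dt) + Fk[n] - Gk[n]
```

A useful property of this iteration is that after k corrections the solution matches the serial fine solver exactly on the first k windows, so speedup depends on converging in far fewer iterations than there are windows.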
How the Method of Frobenius Led to Self-Replicating 3D Zombie Vortices in Turbulent Lab Flows and the Formation of Stars
Wednesday, October 16, 3:30 - 4:30 p.m., 939 Evans Hall - UC Berkeley Campus
Philip Marcus, University of California, Berkeley
Abstract: We report a new mechanism for creating vortices in a class of flows that are linearly stable and believed by most researchers to also be finite-amplitude stable. We find that the vortices form in numerical simulations of stably stratified Couette flows (both plane and circular), as well as in simulations of protoplanetary disks around forming protostars. Our study was motivated by the fact that protoplanetary disks must have flow instabilities to form stars. The mechanism we discovered allows small-amplitude perturbations (i.e., with small volumes and Rossby numbers) to form vortices that are large in volume and amplitude (with a Rossby number of order unity). The energy of the vortices becomes large, and it is supplied by the kinetic energy of the background shear flow. The underlying mathematics of the finite-amplitude instability lies in Math 53 and 54. Our vortices have an unusual property: a vortex that grows from a single, local perturbation triggers a new generation of vortices to grow at nearby locations. After the second generation of vortices grows large, it triggers a third generation. The triggering of subsequent generations continues ad infinitum, so a front dividing the vortex-dominated flow from the unperturbed flow advances until the entire domain fills with large vortices. The vortices do not advect across the region; the front of the vortex-populated fluid does. The region in protoplanetary disks where we have found this new mechanism is thought to be stable; thus, in the astrophysical literature this region is called the dead zone. Because the vortices we report here arise in the dead zone, grow large, and spawn new generations of vortices that march across the domain, we refer to them as zombie vortices.
We consider the mechanism of the zombie vortices’ growth and advance in a proposed lab experiment: circular Couette flow with a vertically stably stratified Boussinesq fluid (i.e., salt water) with a density that is linear with height. Because this flow is nearly homogeneous, the first vortex formed by the initial instability self-replicates in an approximately spatially self-similar manner and fills the domain with a lattice of 3D vortices, which persists despite the fact that the flow is turbulent.