InTheLoop | 11.03.2014
Mathematical Models Shed New Light on Cancer Mutations
A team of researchers from Harvard Medical School, using computing resources at the U.S. Department of Energy’s National Energy Research Scientific Computing Center (NERSC), has demonstrated a mathematical toolkit that can turn cancer-mutation data into multidimensional models showing how specific mutations alter the social networks of proteins in cells.
From this they can deduce which mutations among the myriad mutations present in cancer cells might actually play a role in driving disease. Their latest findings—among the first to be produced from the new Laboratory of Systems Pharmacology (LSP) at Harvard—were published November 2 in Nature Genetics. »Read more.
Prabhat Co-Edits Parallel I/O Best Practices Book
In this era of “big data,” high performance parallel I/O is extremely important. Yet, the last book to summarize best practices in this area was written more than a decade ago. To fill the void, Prabhat of the Lawrence Berkeley National Laboratory (Berkeley Lab) and Quincey Koziol of the HDF Group brought together leading practitioners, researchers, software architects, developers and scientists to contribute their insights for a new book called “High Performance Parallel I/O.”
The book was published on October 24. The editors will be signing copies and answering questions in the Department of Energy’s booth (#1939) at SC14 (the 2014 Supercomputing Conference) at 5:15 p.m. on Tuesday, November 18. »Read more.
ESnet’s Diversity Support Sends Women to Networking Conference
To help boost the role of women in the networking community, ESnet supported two women attending the 2014 Technology Exchange conference held Oct. 27-31 in Indianapolis. Julie Petersen of Berkeley Lab’s IT Division was one of the two women selected by ESnet, along with Arzu Gosney, an IT manager at Pacific Northwest National Laboratory.
ESnet and Internet2, co-organizers of the conference, each sponsored two women’s participation to help increase diversity among research and education networking professionals. »Read more.
This Week's CS Seminars
Supernovae Are Turbulent Beasts: The Crucial Role of Turbulence in Core-Collapse Supernova Explosions
Tuesday, Nov. 4, 1–2 p.m., Bldg. 50F, Room 1647
Sean Couch, Caltech
Core-collapse supernovae, the explosive deaths of massive stars, are fundamentally three-dimensional phenomena. Only recently have state-of-the-art computational hardware and software made high-fidelity 3D simulations possible, and this is driving a revolution in our theoretical understanding of core-collapse supernovae. I will discuss recent results from cutting-edge 3D simulations of supernovae and focus in particular on my work showing that turbulence plays a central role in aiding successful explosions in 2D and 3D. Strong turbulence behind the stalled supernova shock provides an effective pressure that aids the neutrino heating in pushing the shock outward. This is both a good and a bad thing because, on the one hand, turbulence is more efficient at aiding shock expansion than the neutrino heating, but on the other, the resolution required to correctly capture the turbulence may be beyond that of even the highest-resolution 3D supernova simulations yet accomplished. The important part turbulence plays also provides a simple explanation for why 2D simulations explode more readily than their 3D counterparts and underscores the growing consensus that 3D simulations are a necessity if we hope to make progress in core-collapse supernova theory.
Arbitrary-order Hybrid Finite-Element Methods for Geophysical Flows
Wednesday, Nov. 5, 2014, 2–3:30 p.m., Bldg. 50A, Room 5132
Jorge Guerra, UC Davis
A new geophysical fluid simulation dynamical core is described with an emphasis on vertical motion. The underlying technology is briefly discussed, including a novel Hybrid Finite Element Method (HFEM) vertical coordinate coupled with high-order Implicit/Explicit (IMEX) time integration to control vertically propagating sound waves. We show results from a suite of mesoscale test cases from the literature that demonstrate the accuracy, performance, and properties of our method on regular Cartesian meshes. The test cases include wave propagation behavior, Kelvin-Helmholtz instabilities, and flow interaction with topography. We show that high-order staggered vertical grids reduce errors that are typical of other vertical discretizations. However, we also discuss the need to incorporate stabilization methods beyond the inclusion of diffusion operators in the vertical direction.
A Stochastic Description of Fluctuations and Polarity Reversals in the Earth's Magnetic Field
Wednesday, Nov. 5, 3:30–4:30 p.m., 939 Evans Hall, UC Berkeley
Bruce Buffett, University of California, Berkeley
The Earth's magnetic field must be continuously regenerated by convection in the liquid metal core to overcome persistent losses due to ohmic decay. Numerical models are capable of producing a self-sustaining magnetic field, but computational limitations prevent solutions with realistic physical properties. Consequently, efforts to apply these models to geological records of the magnetic field are open to question. An alternative approach relies on a stochastic model, which takes advantage of a separation of time scales between short-period convective fluctuations and long-period variations in the dipole field. A stochastic model is constructed from geological observations to provide a simple interpretation of magnetic reversals. Allowing for correlated noise in the stochastic model offers insights into the lifetime of convective eddies in the liquid metal core.
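The general idea behind such stochastic descriptions can be illustrated with a toy Langevin model (a minimal sketch of the approach, not the speaker's actual model): a dipole-like variable sits in a double-well potential, and short-period noise occasionally kicks it across the barrier, producing polarity reversals.

```python
import numpy as np

# Toy sketch (illustrative only): a dipole-like variable x evolves in a
# double-well potential V(x) = (x^2 - 1)^2 under white noise, integrated
# with the Euler-Maruyama method. The two wells stand in for the two
# magnetic polarities; noise-induced hopping between them mimics reversals.
rng = np.random.default_rng(0)

def simulate(n_steps=200_000, dt=1e-3, noise=0.9):
    x = np.empty(n_steps)
    x[0] = 1.0  # start in one polarity well
    for i in range(1, n_steps):
        drift = -4.0 * x[i - 1] * (x[i - 1] ** 2 - 1.0)  # -dV/dx
        x[i] = x[i - 1] + drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
    return x

x = simulate()
sign_changes = int(np.sum(np.diff(np.sign(x)) != 0))  # barrier crossings
print(f"sign changes over the run: {sign_changes}")
```

Raising the noise amplitude shortens the average time between hops, which is the kind of relationship a stochastic model calibrated to geological records can exploit.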
Improved Post Hoc Analysis via Lagrangian Representations
Thursday, Nov. 6, 2014, 12–1 p.m., Bldg. 50F, Room 1647
Alex Agranovsky, University of California, Davis
Fluid mechanics considers two frames of reference for an observer watching a flow field: Eulerian and Lagrangian. The former is the frame of reference traditionally used for flow analysis, and involves extracting particle trajectories based on a stored vector field. With this work, we explore the opportunities that arise when considering these trajectories from the Lagrangian frame of reference. Specifically, we consider a form where flows are extracted in situ and then used for subsequent post hoc analysis. We believe this alternate, Lagrangian-based form will be increasingly useful, because the Eulerian frame of reference is sensitive to the temporal frequency at which data is saved, and architectural trends are causing that frequency to drop rapidly on modern supercomputers. We support our viewpoint by running a series of experiments, which demonstrate that the Lagrangian form can be more accurate, require less I/O, and be faster when compared to traditional advection.
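The sensitivity to temporal output frequency can be seen in a minimal 1D sketch (illustrative only, not the speaker's code): in-situ advection sees the velocity at every simulation step, while post hoc Eulerian advection must reuse whichever velocity field was last written to disk.

```python
import math

# Toy 1D flow dx/dt = sin(t), with exact solution x(t) = x0 + 1 - cos(t).
# We compare advection with full-rate velocity (in situ) against advection
# from velocity fields saved only every `save_every` steps (post hoc).

def lagrangian_in_situ(x0, t_end, dt):
    """Advect during the simulation, seeing the velocity at every step."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x += math.sin(t) * dt  # forward Euler step with full-rate velocity
        t += dt
    return x

def eulerian_post_hoc(x0, t_end, dt, save_every):
    """Advect afterward from velocity fields written every `save_every` steps."""
    x, t, step = x0, 0.0, 0
    v_saved = math.sin(0.0)  # last velocity field written to disk
    while t < t_end - 1e-12:
        if step % save_every == 0:
            v_saved = math.sin(t)  # a new field was saved at this step
        x += v_saved * dt  # between saves, the stale field is reused
        t += dt
        step += 1
    return x

exact = 0.0 + 1 - math.cos(math.pi)  # = 2.0
err_lag = abs(lagrangian_in_situ(0.0, math.pi, 1e-3) - exact)
err_eul = abs(eulerian_post_hoc(0.0, math.pi, 1e-3, save_every=500) - exact)
print(err_lag, err_eul)  # the subsampled Eulerian reconstruction loses accuracy
```

Increasing `save_every` (i.e., saving fields less often, as I/O-constrained machines force) widens the gap, while the in-situ trajectory is unaffected.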
Link of the Week: What's in a (Supercomputer) Name?
Ever wondered how NERSC supercomputers get their names? Services Department Head Katie Antypas explains the how and why behind computer names like "Hopper" and "Carver." In the piece she relates an interesting story about kids, computers and why names matter. »Read more.