InTheLoop | 04.12.2010
NERSC and JGI Form Closer Partnership
The computational and storage challenges associated with genomics research at the DOE Joint Genome Institute (JGI) are growing rapidly in response to scientific demand and the increased capability of modern sequencers. The NERSC Division and JGI plan to join forces to address these challenges, combining NERSC's expertise in running a large production computational facility with JGI's expertise in genomics. NERSC and JGI have agreed that NERSC will assume responsibility for supporting JGI's servers, and that the six-person JGI systems staff will transfer to the NERSC Division.
“This move will bring to NERSC the domain-specific knowledge within the JGI team about the genomics requirements, which is quite different from NERSC’s more traditional HPC workload,” said NERSC Division Director Kathy Yelick when making the announcement on April 9. “The science is heavy in data analytics, and JGI runs several dozen databases and web portals providing access to genomic information. Data-centric computing is an important future direction for NERSC, and genomics is seeing exponential increases in computation and storage….
“We will determine the final organizational structure after we review current challenges, needs of the scientific programs supported by JGI, and future computational demands,” Yelick added. More details on the partnership will be available soon. NERSC will welcome the JGI IT group at a party from 3:30 to 4:30 pm on Friday, April 16, at OSF 238.
Vis/Analytics Group Helps Launch New Data Analysis Center
Researchers in CRD’s Visualization and Analytics Group will receive approximately $1 million over the next four years to help establish a new, state-of-the-art visualization data analysis center aimed at interpreting the massive amounts of data produced by today’s most powerful supercomputers.
Called Remote Data Analysis and Visualization (RDAV), the center will be located at the University of Tennessee’s (UT) National Institute for Computational Science. The center will be built by a collaboration of researchers from UT, Berkeley Lab, University of Wisconsin at Madison, University of Illinois’ National Center for Supercomputing Applications, and Oak Ridge National Laboratory. Read more.
Analytics Staff to Speak at SIAM Conference on Imaging Science
Daniela Ushizima and Mark Howison, both members of the NERSC Analytics Team and CRD Visualization Group, will present papers in a session (organized by Ushizima) on “Modeling and Analysis of Biomedical Images” at the SIAM Conference on Imaging Science (IS10), April 12-14 in Chicago.
Ushizima will present a paper on “Retinopathy Diagnosis from Ocular Fundus Image Analysis,” co-authored with Fatima Medeiros of the Federal University of Ceará, Brazil. Howison will present a paper on “Comparing GPU Implementations of Bilateral and Anisotropic Diffusion Filters for 3D Biomedical Datasets.”
CS Staff Presenting Five Papers at IPDPS
Berkeley Lab Computing Sciences staff will be presenting five papers at the 24th IEEE International Parallel and Distributed Processing Symposium (IPDPS) April 19–23 in Atlanta:
- Samuel Williams and Leonid Oliker co-authored “Optimizing and Tuning the Fast Multipole Method for State-of-the-Art Multicore Architectures” along with Aparna Chandramowlishwaran, Ilya Lashuk, George Biros, and Richard Vuduc of Georgia Tech.
- Shoaib Kamil, Cy Chan, Leonid Oliker, John Shalf, and Samuel Williams co-authored “An Auto-Tuning Framework for Parallel Multicore Stencil Computations.”
- Andrew Uselton, Mark Howison, Nicholas J. Wright, David Skinner, Noel Keen, John Shalf, Karen L. Karavanic (of Portland State University), and Leonid Oliker co-authored "Parallel I/O Performance: From Events to Ensembles."
- Costin Iancu, Steven Hofmeyr, Yili Zheng, and Filip Blagojevic co-authored "Oversubscription on Multicore Processors."
- Deb Agarwal and Keith Jackson, along with collaborators from the University of Virginia, Microsoft Research, and UC Berkeley, co-authored "eScience in the Cloud: A MODIS Satellite Data Reprojection and Reduction Pipeline in Windows Azure Platform."
Koniges to Talk on Fusion Computing at Sherwood Conference
Alice Koniges of NERSC will give an invited presentation at the 2010 International Sherwood Fusion Theory Conference, which will be held April 19–21, 2010 in Seattle, Washington. The title of her talk is “What’s Ahead for Fusion Computing.” It was co-authored with John Shalf and Robert Preissl of NERSC, Stephane Ethier of Princeton Plasma Physics Laboratory, and the Cray Center of Excellence at NERSC and NERSC Cloud Computing teams.
CS Mentoring Program Seeks Mentors, Protégés
Sign-ups are open for a new round of mentoring in the CS Mentoring Program. Volunteer mentors and protégés should sign up by May 1 for pairings that will meet from late May to December. An overview of the program along with sign-up forms for mentors and protégés can be found on the CS Staff web page. Send any questions to email@example.com.
This Week’s Computing Sciences Seminars
Characterization of Turbulent Explosions and Their Interaction with Solid Particle Clouds
Thursday, April 15, 10:00–11:00 am, 50F-1647
Kaushik Balakrishnan, Georgia Institute of Technology
Explosions are common on the battlefield, in coal mines, in the petroleum industry, and elsewhere, yet they are insufficiently studied by the research community; in particular, the effects of turbulence and multiphase environments have not been addressed in detail. This investigation focuses on characterizing the flow-field behind explosions by using a robust two-phase hybrid formulation and Large-Eddy Simulations (LES), with state-of-the-art models to account for the various thermo-fluid dynamic phenomena. A new Eulerian-Lagrangian two-phase formulation is proposed, then extended to an LES framework, and applied to study the problem of multiphase explosions. It is observed that explosion products mix with the ambient air primarily through “macroscopic mixing” at early times, resulting in afterburn and exothermic energy release that significantly affect the flow-field. Furthermore, due to the high densities behind explosions, the interface that separates the explosion products from the air is susceptible to hydrodynamic instabilities, viz. Rayleigh-Taylor and Richtmyer-Meshkov. When inert or reactive solid particles are also present, they pick up momentum and disperse, and pick up heat and ignite (if reactive), resulting in a flow-field that is two-way coupled between the two phases. Both dilute and dense particle clouds are studied, and it is observed that the turbulence intensities, amount of mixing, and afterburn are significantly enhanced by the particles. The growth of hydrodynamic instabilities, turbulence characteristics, particle dispersion, clustering, and ignition in the flow-field behind explosions are studied and elucidated. Overall, this investigation demonstrates a numerical simulation strategy for studying turbulent, explosive, reactive, two-phase flow fields.
Link of the Week: How Many Members Should Be in a Cabinet?
Executive power in most governments is conferred on a committee known as a cabinet, consisting of people having, according to Robert Louis Stevenson, the only profession for which no preparation is thought necessary. The question of how many people make an effective cabinet was first tackled, semi-humorously, by British historian C. Northcote Parkinson. His investigations led to what is now known as the “coefficient of inefficiency”: the conjecture that a cabinet loses its political grip, through an inability to make decisions efficiently, once its membership passes a critical size of 19–22.
A new study, “To how many politicians should government be left?” shows that Parkinson’s conjectures about cabinet sizes and government efficiency are empirically supported. The authors interpret their findings through a simple physical model of opinion dynamics in groups.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.