InTheLoop | 03.19.2012

ASCR Facilities Division Director Position Open

The Department of Energy, Office of Science, Office of Advanced Scientific Computing Research (ASCR), Facilities Division is seeking a motivated and highly qualified individual to serve as its Director. The Facilities Division Director is responsible for the overall management of the ASCR facilities, including strategic planning, budget formulation and execution, project management, program integration with other Office of Science activities, interagency and international liaison, and management of the federal and rotator technical and administrative staff in the ASCR program office.

ASCR Associate Director Daniel Hitchcock is seeking suggestions for candidates for this position, including self-nominations. Each individual suggested will be sent a letter with information on the position and how to apply, and will be encouraged to submit an application. Please send your suggestions, with an e-mail address for each candidate, to daniel.hitchcock@science.doe.gov as soon as possible.

More information on the ASCR Facilities Division program and the online application are available from the DOE Office of Science. The application deadline is April 9, 2012.

CCSE Awarded Blue Waters Early Science Program Allocation

The Center for Computational Sciences and Engineering (CCSE) in the Computational Research Division has been awarded 24 million CPU hours on the Blue Waters Early Science System at the University of Illinois' National Center for Supercomputing Applications. The project, titled "Type Ia Supernovae Simulation," is a collaboration with Chris Malone and Stan Woosley of the University of California at Santa Cruz, and Mike Zingale of Stony Brook University. CCSE will be running supernova explosion simulations over an eight-week period at an unprecedented ~100 m resolution using the CASTRO code to study early post-ignition dynamics.

Also, a little farther north in Illinois, a CCSE animation of a low-Mach-number combustion simulation is being shown on two large screens as part of a fire exhibition at the Chicago Museum of Science and Industry.

SC12 Submissions Open for Papers, Panels, Tutorials and Workshops

Submissions are now being accepted for the technical program for SC12, the international conference on high performance computing, networking, storage and analysis. SC12 will take place Nov. 10–16 at the Salt Palace Convention Center in Salt Lake City. Abstracts for technical papers and ACM Gordon Bell Prize nominations are due Friday, April 20. Full final papers and ACM Gordon Bell Prize nominations are due Friday, April 27, as are submissions for panels, tutorials and workshops. Abstracts for the SCinet Research Sandbox are also due Friday, April 27. The State of the Practice track introduced at SC11 is now incorporated as an area under technical papers at SC12 and has the same submission deadlines as technical papers. Submissions for all Technical Program areas are made via https://submissions.supercomputing.org/.

This Week’s Computing Sciences Seminars

Cloud Seminar: Running the Largest HDFS Cluster
Monday, March 19, 11:00 am–12:00 pm, 380 Soda Hall, UC Berkeley
Hairong Kuang, Facebook

HDFS is a highly scalable fault-tolerant distributed file system designed for running on low-cost commodity hardware. HDFS creates multiple replicas of data blocks and distributes them on compute nodes throughout a cluster to enable fast computation on large data sets.
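The replication strategy described above can be illustrated with a toy model of HDFS's default rack-aware placement for three replicas: first replica on the writer's node, second on a node in a different rack, third on another node in that same remote rack, so a single rack failure cannot lose the block. This is an illustrative sketch, not HDFS code; the names here are invented for the example:

```python
import random

def place_replicas(writer_node, nodes_by_rack):
    """Toy model of HDFS's default 3x replica placement: replica 1 on the
    writer's node, replica 2 on a node in a different rack, replica 3 on
    another node in that same remote rack."""
    rack_of = {n: r for r, nodes in nodes_by_rack.items() for n in nodes}
    writer_rack = rack_of[writer_node]
    remote_rack = random.choice([r for r in nodes_by_rack if r != writer_rack])
    second = random.choice(nodes_by_rack[remote_rack])
    third = random.choice([n for n in nodes_by_rack[remote_rack] if n != second])
    return [writer_node, second, third]

racks = {"rack1": ["n1", "n2"], "rack2": ["n3", "n4"], "rack3": ["n5", "n6"]}
print(place_replicas("n1", racks))  # e.g. ['n1', 'n3', 'n4']
```

Placing two of the three replicas in one remote rack trades a little rack-level spread for lower cross-rack write traffic, which matters at the scale of cluster described below.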

Facebook uses HDFS as a data warehouse, storing Hive tables that collect user behavior from Facebook's front-end pages. The warehouse HDFS cluster consists of thousands of nodes configured with close to 100 PB of storage space. The cluster currently stores hundreds of millions of files and grows by hundreds of terabytes of physical space each day, all while servicing a huge load of I/O requests.

The Facebook warehouse cluster is by far the largest HDFS deployment in the world in terms of capacity. Keeping such a huge file system up, fast and reliable poses exciting and interesting challenges for the HDFS team. This talk will present more information on the scale of the cluster, the problems we face, and the solutions we have developed to improve the cluster's efficiency and scale.

Self-Stabilization: An Approach to Building Robust Software
Monday, March 19, 4:00–5:30 pm, 306 Soda Hall, UC Berkeley
Brian Demsky, UC Irvine

Self-stabilizing programs automatically recover from state corruption caused by software bugs to reach the correct state. A number of applications are inherently self-stabilizing — these programs typically overwrite all non-constant data with data derived from new input data. We present a type system and static analysis that together check whether a program is self-stabilizing. We combine this with a code generation strategy that ensures a program continues executing long enough to self-stabilize. Our experience with our implementation, SJava, indicates that the annotations are easy to write once one understands a program, and SJava successfully verified that our benchmarks were self-stabilizing.
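The overwrite pattern the abstract describes can be sketched in a few lines: every iteration derives all non-constant state solely from the new input, so corrupted state cannot survive past one processing step. This is a toy illustration of the general idea, not SJava itself; the function and field names are invented:

```python
def process(reading_f):
    """Derive ALL non-constant state from the new input alone. Because
    nothing carries over between calls, any corruption of the previous
    state is overwritten on the next input (the self-stabilizing pattern)."""
    celsius = (reading_f - 32) * 5.0 / 9.0
    alarm = celsius > 100.0
    return {"celsius": celsius, "alarm": alarm}

state = process(212)        # correct state derived from a 212 F reading
state["celsius"] = -9999.0  # simulate state corruption from a bug
state = process(212)        # the next input rebuilds the state from scratch
assert abs(state["celsius"] - 100.0) < 1e-9
```

A program that instead accumulated state across iterations (say, a running average) would not stabilize this way, which is exactly the property the type system and static analysis are checking for.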

LAPACK Seminar: Interface Tracking in Multiphase Physics: Geometry, Foams and Thin Films
Wednesday, March 21, 12:10–1:00 pm, 380 Soda Hall, UC Berkeley
Robert Saye, UC Berkeley and LBNL/CRD

Many scientific and engineering problems involve interconnected moving interfaces separating different regions, including dry foams, crystal grain growth and multi-cellular structures in man-made and biological materials. Producing consistent and well-posed mathematical models that capture the motion of these interfaces, especially at degeneracies, such as triple points and triple lines where multiple interfaces meet, is challenging.

We introduce a new computational method (Saye and Sethian, PNAS, 2011) for tracking the interface in general multiphase problems. It combines properties of level set methods with a geometric construction, yielding a robust, accurate and efficient numerical method that automatically handles the evolution of triple points/lines and topological change in the multiphase system. We present applications in geometric and fluid flow problems that demonstrate many of the method's virtues, including a model of liquid drainage in an unstable foam leading to thin-film interference patterns and bubble bursting. This work is joint with J. Sethian of UC Berkeley/LBNL.
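The level set idea the method builds on represents an interface implicitly as the zero set of a function phi and moves the interface by evolving phi under a PDE. A minimal one-dimensional sketch (a toy with constant normal speed, not the multiphase method described in the talk) evolves phi_t + F|phi_x| = 0 with first-order Godunov upwinding:

```python
def evolve_level_set(phi, dx, speed, dt, steps):
    """Advance phi under phi_t + F*|phi_x| = 0 (motion at constant normal
    speed F > 0) with a first-order Godunov upwind scheme. The tracked
    interface is wherever phi crosses zero."""
    for _ in range(steps):
        new = phi[:]
        for i in range(1, len(phi) - 1):
            dminus = (phi[i] - phi[i - 1]) / dx   # backward difference
            dplus = (phi[i + 1] - phi[i]) / dx    # forward difference
            grad = max(max(dminus, 0.0), -min(dplus, 0.0))  # Godunov |phi_x|
            new[i] = phi[i] - dt * speed * grad
        # linear extrapolation at the domain ends
        new[0] = 2.0 * new[1] - new[2]
        new[-1] = 2.0 * new[-2] - new[-3]
        phi = new
    return phi

n, dx = 201, 0.005                        # grid on [0, 1]
phi = [i * dx - 0.3 for i in range(n)]    # signed distance; interface at x = 0.3
phi = evolve_level_set(phi, dx, speed=1.0, dt=0.002, steps=100)
# the zero crossing should have moved to roughly x = 0.3 + 1.0 * 0.2 = 0.5
zero = next(i * dx for i in range(n - 1) if phi[i] <= 0.0 < phi[i + 1])
print(round(zero, 3))
```

A single phi handles merging and splitting of one interface automatically; the contribution of the talk's method is extending this kind of implicit tracking to many phases meeting at triple points and triple lines.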

About Computing Sciences at Berkeley Lab

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.