Berkeley Lab Team Wins Best Paper Award at Cloud Computing Conference
June 28, 2010
Contact: Jon Bashor, email@example.com, 510-486-5849
A team of researchers from Lawrence Berkeley National Laboratory has received the Best Paper Award at ScienceCloud 2010, the 1st Workshop on Scientific Cloud Computing sponsored by the Association for Computing Machinery. The paper, “Seeking Supernovae in the Clouds: A Performance Study,” was written by Keith Jackson and Lavanya Ramakrishnan of the Advanced Computing for Science Department (ACS), Karl Runge of the Physics Division and Rollin Thomas of the Computational Cosmology Center (C3).
“What a wonderful accomplishment at the very first Scientific Cloud Computing workshop,” said Horst Simon, director of the Computational Research Division, which is home to both ACS and C3. “This paper builds on the strong foundation we are building in the area of scientific cloud computing.”
In their paper, the authors write that the discovery of Dark Energy “was made by comparing the brightness of nearby Type Ia supernovae (which exploded in the past billion years) to that of much more distant ones (from up to seven billion years ago). The reliability of this comparison hinges upon a very detailed understanding of the physics of the nearby events. As part of its effort to further this understanding, the Nearby Supernova Factory (SNfactory) relies upon a complex pipeline of serial processes that execute various image processing algorithms in parallel on approximately 10TBs of data.”
In the past, this data pipeline was typically fed into a cluster computer, but the authors found that “Cloud computing offers many features that make it an attractive alternative. The ability to completely control the software environment in a Cloud is appealing when dealing with a community developed science pipeline with many unique library and platform requirements.”
For the project, the team studied the feasibility of porting the SNfactory pipeline to the Amazon Web Services environment and described the tool set they developed to manage a virtual cluster on Amazon EC2. The paper explores the design options available for placing application data in the cloud, and offers detailed performance results and lessons learned from each of those options.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 7,000-plus scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are Department of Energy Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the DOE's Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.