InTheLoop | 09.04.2007
The weekly newsletter for Berkeley Lab Computing Sciences
September 4, 2007
Network Services Expert Joins ESnet
Vangelis Chaniotakis, a senior software engineer from the University of Crete in Greece, has accepted a one-year appointment with ESnet to work on a virtual circuits provisioning project.
Chaniotakis started his post at ESnet in August, bringing with him expertise in developing network services for the Greek Research and Technology Network (GRNET), a national network supporting research and academic institutions, and for GÉANT, the pan-European research and education networking organization (similar to Internet2 in the United States).
At ESnet, Chaniotakis will work on the On-Demand Secure Circuits and Advance Reservation System (OSCARS), which allocates network bandwidth to large-scale scientific projects in the form of virtual circuits across multiple networks, ensuring high-performance data transport. ESnet has been working with science network operators from the U.S. and Europe to deploy OSCARS and ensure its interoperability with similar bandwidth provisioning software. More information on OSCARS is available at http://www.es.net/OSCARS/index.html.
Thursday Seminar on Distributed System for Genetic Analysis
Mark Silberstein, a Ph.D. student at Technion (the Israel Institute of Technology), will give a talk on “Superlink-online: A large-scale distributed system for genetic linkage analysis” at 11 a.m. Thursday, Sept. 6, in the 50B-4205 conference room.
Here’s the abstract for his talk:
“Genetic linkage analysis is a statistical tool used by geneticists for mapping disease-susceptibility genes in the study of genetic diseases. However, such analysis is often beyond the capabilities of a single computer. We present a distributed system for faster analysis of genetic data, called Superlink-online. The system achieves high performance through parallel execution of linkage analysis tasks over thousands of computational resources residing in multiple opportunistic computing environments, also known as Grids.
“Notably, the system is available online, which allows geneticists to perform computationally intensive analyses without installing software or maintaining a complicated distributed environment.
“In this talk we will describe the scheduling system architecture which drives Superlink-online. The main challenges have been to efficiently split large tasks for distributed execution in a highly dynamic, non-dedicated running environment, and to provide nearly interactive response time for shorter tasks while simultaneously serving massively parallel ones. The system utilizes resources in all the available grids, unifying thousands of CPUs across campus grids at the Technion and the University of Wisconsin–Madison, EGEE grids in Europe, and the Superlink@Technion community computing grid.
“The system is being extensively used by medical centers worldwide. Since January 2006, over 12,000 interactive genetic analysis tasks have been performed, utilizing over 240 years of CPU time.”
Silberstein’s main research focus has been efficient serial and parallel algorithms for inference in Bayesian networks (in the context of genetic linkage analysis), and their execution in large-scale opportunistic computing environments. He is currently visiting UC Davis, working with Prof. John Owens on the parallelization of Bayesian inference on GPUs.
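The abstract mentions a central scheduling tension: short tasks need nearly interactive response times while massively parallel tasks keep thousands of CPUs busy. As a rough illustration only — this is a toy two-class priority queue, not Superlink-online's actual scheduler, and the class names and task strings are invented for the example — the basic dispatch idea might look like this:

```python
import heapq

class TwoClassQueue:
    """Toy scheduler: interactive tasks always dispatch ahead of batch work,
    with FIFO ordering inside each class."""
    INTERACTIVE, BATCH = 0, 1

    def __init__(self):
        self._heap = []
        self._seq = 0  # monotonic counter used as a FIFO tie-breaker

    def submit(self, task, interactive=False):
        cls = self.INTERACTIVE if interactive else self.BATCH
        heapq.heappush(self._heap, (cls, self._seq, task))
        self._seq += 1

    def next_task(self):
        # Pop the lowest (class, sequence) pair; interactive wins over batch.
        return heapq.heappop(self._heap)[2] if self._heap else None

q = TwoClassQueue()
q.submit("whole-genome linkage scan")          # long-running batch task
q.submit("small pedigree query", interactive=True)
first = q.next_task()   # the interactive task is dispatched first
```

A real system would also have to split the large batch tasks into independently schedulable pieces and cope with resources disappearing mid-run, which is where most of the complexity described in the talk lies.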
OSF Brown Bag on Grid Troubleshooting Thursday
This week’s OSF brown bag lunch discussion will be “Centralized Logging for Grid Troubleshooting” with Brian Tierney and Dan Gunter of the Distributed Systems Department in CRD. It will take place from noon to 1 p.m. Thursday, Sept. 6, in Room 238 at the Oakland Scientific Facility. Here is the abstract:
“Tracking failures across a widely distributed system of resources has proven challenging to many DOE applications. This can be an issue not only for Grid computing but for anyone performing large-scale data transfers to remote machines. A single action such as reliably transferring a directory of files can involve coordinating a wide range of loosely coupled software tools, including security software, delegation services, and file transfer tools. The Open Science Grid (OSG) project, for example, currently experiences a 15% job failure rate.
“As part of the Center for Enabling Distributed Petascale Science (CEDPS) project, we are building an infrastructure to work with current middleware and system software to more easily track failures and discover anomalous behavior. This infrastructure comprises a common logging format, the extension of syslog-ng for centralized collection of data, a data summarizer to more easily manage the volume of logging, and an anomaly detection system that can connect to a warning system when unexpected behaviors occur. We are currently working with OSG to deploy a prototype of the full system.
“This talk will describe our common logging recommendations, our use of syslog-ng, and our methods for locating failures and performance issues.”
For more information, see http://cedps.net/wiki/index.php/LoggingBestPractices.
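The talk announcement does not spell out the common logging format, but centralized log collection of the kind described generally favors single-line, timestamped name=value records that are trivial for a summarizer to parse. The following sketch is a hypothetical illustration of that style — the field names (ts, event, level, status, bytes) are assumptions for the example, not the project's actual specification:

```python
from datetime import datetime, timezone

def format_record(event, level="INFO", **fields):
    """Render one single-line, name=value log record (hypothetical format)."""
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    pairs = [f"ts={ts}", f"event={event}", f"level={level}"]
    pairs += [f"{k}={v}" for k, v in sorted(fields.items())]
    return " ".join(pairs)

def parse_record(line):
    """Parse a name=value record back into a dict (values must not contain
    spaces in this simplified sketch)."""
    return dict(pair.split("=", 1) for pair in line.split())

line = format_record("transfer.end", status="0", bytes="1048576")
record = parse_record(line)
```

One-line records like this forward cleanly through syslog-style transports, and consistent event names make it straightforward to pair start/end events and flag the failures the talk focuses on.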
Anil Deane Ends Term as Math Program Manager in ASCR
Anil Deane, who has served as the Program Manager for Applied Mathematics Research in DOE’s Office of Advanced Scientific Computing Research for the past 19 months, has stepped down and will return to his position as a professor in the Institute for Physical Science and Technology at the University of Maryland, College Park.
“I have been very proud to have had the opportunity to get to know all the fine research carried out in the program and to meet many of you personally,” Deane wrote. “My time in ASCR has been extraordinarily rewarding and successful. Not only did I have the opportunity to serve as the math PM, for a considerable period of time I concurrently served as PM for ASCR partnerships and the math centers and institutes in the SciDAC program, and was the ASCR PM responsible for almost two-thirds of the SciDAC portfolio. SciDAC is the finest computational science research program in the country and deeply involves many from our applied mathematics community.”
Sept. 5 CITRIS Talk to Look at Surveying Sierra Snowpack by Satellite
The Center for Information Technology Research in the Interest of Society (CITRIS) at UC Berkeley kicks off its fall series of Research Exchange talks at noon Wednesday, Sept. 5, with “Patterns in the Sierra Nevada from Blended Satellite and Ground-Based Networks” by Roger Bales, Professor of Engineering at UC Merced. The talk is open to the public and will be held in the Dado and Maria Banatao Conference Room (290) of the Hearst Memorial Mining Building on the UC Berkeley campus.
Here is the abstract:
“Accurate, frequent satellite-derived snow covered area (SCA) products provide the opportunity to explore the spatial patterns of snow, as well as the impact of snow accumulation and ablation on snow distribution along elevation gradients. Spatial maps of snow water equivalent highlight elevation bands that contribute significantly to snowmelt across the basin, as well as those elevation bands that are susceptible to warming and thus rapid depletion of the snowcover. Using a season-long energy-balance approach, it is apparent that higher elevations contribute a disproportional share of the basin snowmelt. These results also underscore deficiencies in the current measurement network and provide the impetus for the design of an adequate measurement network along elevational gradients.
“Ongoing deployments of snow and water-balance instrument clusters in the Sierra Nevada are designed to overcome these deficiencies, and provide low-cost, near-real-time information on spatial snow depths.”
The complete schedule of CITRIS talks for the fall semester is online at http://www.citris-uc.org/RE-Fall2007.
August Issue of CRD Report Is Now Online
The August issue of CRD Report is now online. Learn what cool stuff your fellow researchers in Computing Sciences have been doing lately. In this issue, you will read about:
* A DOE Computational Science Graduate Fellow's work on bioinformatics.
* Nanorod research that relied on computational methods developed by Denis Demchenko and Lin-Wang Wang.
* A new postdoc/Alvarez Fellow who works with John Bell on numerical methods for chemistry and nanoscience applications.
* Workshops on the ACTS Collection and the Bro intrusion detection technology.
The CRD Report can be found at http://crd.lbl.gov/html/news/CRDreport0807.pdf.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.