
InTheLoop | 08.18.2014


Berkeley Lab’s SPOT Suite Transforms Beamline Science

For decades, synchrotron light sources have operated on a manual, grab-and-go data management model—users travel thousands of miles to run experiments at the football-field-size facilities, download the raw data to an external hard drive, then process and analyze it on their personal computers, often days later. But a recent deluge of data—brought on by faster detectors and brighter light sources—is quickly making this practice untenable.
 
Fortunately, ALS X-ray scientists and facility users, along with computer and computational scientists from Berkeley Lab’s Computational Research Division (CRD) and the National Energy Research Scientific Computing Center (NERSC), saw this situation developing years ago and teamed up to create new tools for reducing, managing, analyzing, and visualizing beamline data. The result of this collaboration, SPOT Suite, is already transforming the way scientists run their experiments at the ALS. »Read more.

Computational Cosmologists Honored by NASA

Julian Borrill, Reijo Keskitalo and Ted Kisner have received a NASA Group Achievement Award for "outstanding supercomputing support provided to the US Planck Team." The team was honored for their work in support of the first Planck data release. They will accept their award at a September 16 ceremony at NASA's Jet Propulsion Laboratory in Southern California. »Read more about Berkeley Lab's contribution to the Planck Data Release.

Julian Borrill to Head Computational Cosmology at Berkeley Lab

Senior Scientist Julian Borrill has been selected to head Berkeley Lab’s Computational Cosmology Center (C3). Effective August 4, he will lead all technical and administrative aspects of C3—including engaging in scientific program development and serving as principal investigator for proposals and projects. Borrill formerly co-led C3 with Peter Nugent, who was promoted to Deputy for Scientific Engagement in the Computational Research Division earlier this year. »Read more.

Apply Now: 2015 Luis W. Alvarez Fellowship in Computing Sciences

Berkeley Lab’s Computing Sciences is accepting applications for the 2015 Luis W. Alvarez Postdoctoral Fellowship in Computing Sciences. The fellowship gives recent graduates (degree awarded within the last three years) the opportunity to work on some of the most important research challenges in computing sciences—from the architecture and software of next-generation high-performance computing systems and networks to mathematical modeling, algorithms, and applications of advanced computing in materials science, biology, astronomy, climate science, and other scientific domains. Applications for Fall 2015 are due November 26, 2014. »Learn more and apply.

Computing Sciences ALD Kathy Yelick Meets with U.S. Secretary of Commerce

Associate Lab Director for Computing Sciences Kathy Yelick met with Commerce Secretary Penny Pritzker in a meeting focused on a trade and energy mission to Africa being led by U.S. Representative Barbara Lee. Yelick will represent Berkeley Lab as part of the delegation. »Read more.

This Week's Computing Sciences Seminar

Avoiding Communication, Tuning Precision, and Reproducibility at Exascale

Friday, August 22, 11:00 a.m.–12:30 p.m., Bldg. 50B, Room 4205
James Demmel, Professor of Mathematics and Computer Science, University of California, Berkeley

We survey three research areas that have particular relevance to computing at exascale. The first is motivated by the cost of communication, i.e., data movement, between levels of a memory hierarchy or between processors over a network. This cost, measured in time or energy, has long been increasing relative to the cost of arithmetic, and it is the bottleneck in many algorithms. We have developed lower bounds on communication, and new algorithms that attain them, for many algorithms in direct and iterative linear algebra and, more recently, for general programs that access arrays.

The second area is reducing the precision of intermediate variables and operations to the minimum needed to get an acceptable final result, again motivated by reducing the cost of moving, storing, and computing with more data than necessary. We will describe a tool we are developing to automate this analysis and find the least precision required.

Finally, we discuss reproducibility, i.e., getting the same answer when a program is run more than once, despite changes in the scheduling of computing resources that cause different roundoff and branching to occur. Reproducibility can be important both for debugging and for correctness in some applications.
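The reproducibility problem the abstract describes is easy to see in a few lines of code: floating-point addition is not associative, so the order in which a reduction combines partial sums changes the rounded result, and a parallel sum whose reduction order depends on scheduling can return a different answer on every run. The following Python sketch is illustrative only (the synthetic data and the use of math.fsum as an order-independent reference are our own choices, not the talk's methods):

```python
import math
import random

# Floating-point addition is not associative: (a + b) + c != a + (b + c)
# in general. Summing the same values in two different orders therefore
# yields two different rounded results.
random.seed(0)
values = [random.uniform(-1.0, 1.0) * 10.0 ** random.randint(-8, 8)
          for _ in range(100_000)]

forward = sum(values)             # left-to-right reduction order
backward = sum(reversed(values))  # right-to-left reduction order
exact = math.fsum(values)         # correctly rounded, order-independent

print(f"forward sum:  {forward!r}")
print(f"backward sum: {backward!r}")
print(f"difference:   {forward - backward!r}")  # typically nonzero
print(f"fsum reference: {exact!r}")
```

As the fsum reference suggests, an order-independent answer is attainable by doing extra work per addition; reproducible summation schemes for large parallel machines make a similar accuracy-for-effort trade at scale.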