
InTheLoop | 12.17.2012

December 17, 2012

Kathy Yelick Named ACM Fellow for Contributions to Parallel Languages

Associate Lab Director for Computing Sciences Kathy Yelick has been named a Fellow of the Association for Computing Machinery (ACM). Her citation reads, “For contributions to parallel languages that improve programmer productivity.” She was one of 52 ACM Fellows announced Tuesday, Dec. 11.

The letter nominating her began, "Dr. Yelick's research contributions have improved fundamental understanding and practice of parallel programming, performance tuning, compilation, and runtime systems. Dr. Yelick is an innovative researcher, an effective teacher and mentor, and a world leader in the field of high performance computing.  Her software is used in both the research community and in production environments, and her publications currently have 6992 citations, an H-index of 39 and a G-index of 81.  She gives frequent invited talks, including conference keynotes and distinguished lectures. She is involved in advisory committees at the national and California state level, and as the Associate Laboratory Director for Computing Sciences at Lawrence Berkeley National Laboratory, is responsible for setting the computing research directions of the Laboratory." Read more.


Wes Bethel Is Named ACM Distinguished Scientist

Wes Bethel, leader of the Visualization Group in the Computational Research Division, has been named an ACM Distinguished Scientist. The Distinguished Scientist level recognizes “Those ACM members with at least 15 years of professional experience and 5 years of continuous professional membership who have achieved significant accomplishments or have made a significant impact on the computing field.”

The letter nominating Bethel cited his nearly 30 years of work on visualization and data analysis and called him “a well-regarded leader of scientific visualization.” During his 10 years leading the Visualization Group, the group has grown from three staff members to 14, and the team regularly produces field-leading work recognized with best paper awards at conferences. Bethel coordinates and leads or co-leads a number of multi-institutional research activities, such as the “Scalable Data Management, Analysis and Visualization Institute” and “Visual Data Exploration and Analysis of Ultra-large Climate Data.”

The SciDAC Visualization and Analytic Center for Enabling Technologies, for which Bethel was Coordinating Principal Investigator from 2006 to 2011, was the largest open-science, visualization-focused project DOE has ever funded in dollar terms; it worked to make production-quality, petascale-capable visualization a reality. Bethel oversees and directly or indirectly manages a portfolio of over a dozen research projects with a total annual budget of more than $5M. His research group at Berkeley Lab produced 66 technical publications during 2011–2012, including a new book from CRC Press, High Performance Visualization, which describes seminal work in the field.


20-Year-Old Report Successfully Predicted Global Warming

Time has proven that even 22 years ago, climate scientists understood the dynamics behind global warming well enough to predict it accurately, according to an analysis that compares predictions made in 1990 with 20 years of temperature records. After an adjustment to account for natural fluctuations, the predicted and observed increases matched up, the new study found.

The predictions in question come from the first climate assessment report issued by the Intergovernmental Panel on Climate Change (IPCC) in 1990. The analysis, co-authored by Daithi Stone of the Computational Research Division, was published online in the journal Nature Climate Change on Dec. 9. Read more.


Exabyte Problem: Climate Scientists Grapple With a Deluge of Data

Climate science is a computationally intense discipline. The entire idea is to figure out what a massively complex system—essentially, the world—is going to do based on hundreds of different variables, including carbon dioxide concentrations, cloud cover, airplane contrails, and so on. And the scientific community's means for measuring those variables has improved dramatically in recent years, with satellites and any number of terrestrial sensors multiplying all the time. This is a good thing in principle, and a very complicated thing in practice.

Data volumes are increasing far faster than computer power, due to improvements in sensors. To address this challenge, many groups have started developing tools aimed specifically at helping computing power catch up with the available data. Prabhat of the Computational Research Division is one of the experts quoted in an IEEE Spectrum article that discusses these tools, including Berkeley Lab’s TECA: Toolkit for Extreme Climate Analysis. Read more.


NERSC 2013 Allocation Awards Are Announced

The NERSC 2013 allocation awards have been announced. To see the projects, principal investigators, and computer time awarded, go here.


Ladder Safety Reminders

Many CS employees use stepladders and have taken the EHSS Ladder Safety Training course. This course is now online, so if you would like to review ladder safety practices, you can do so easily. Remember, falls from even the first or second step of a ladder can have serious consequences.

According to the World Health Organization, the United States leads the world in ladder deaths. Each year, there are more than 164,000 emergency room-treated injuries and 300 deaths in the U.S. that are caused by falls from ladders. So whether you are using a ladder at work or at home, it is important to practice good ladder safety.

Recently, the United Kingdom's Ladder Association held a contest to determine the "Biggest Idiot on a Ladder." Entries were posted on the Ladder Association's Facebook page, where winners were chosen by vote. The Ladder Association's Cameron Clow said, “Reported ladder accidents have fallen by over 30% over the last 10 years, but more clearly needs to be done. While it is already known that falls from height are the number one cause of death at work, the number and variety of the pictures are startling evidence that dangerous ladder use is occurring daily.”


This Week’s Computing Sciences Seminars

Reducing Memory and Communication Requirements in Linear Algebra

Monday, Dec. 17, 11:00 am–12:00 pm, 50F-1647
Mathias Jacquelin, INRIA

With exascale in sight, it is essential to ask whether current state-of-the-art algorithms will scale. Current and forthcoming platforms implement deeper and deeper hierarchies, both in memory and in network topology. Classical algorithms designed for 2D grids are not well adapted to these new architectures and thus need to be revisited. Moreover, the amount of memory available per core will likely shrink on larger platforms, and this trend needs to be taken into account.

HPC applications make extensive use of linear algebra methods, which makes these methods good candidates for improvement. In this talk, I will present three studies in dense and sparse linear algebra, focusing on reducing the communication requirements of dense QR and LU factorizations and on minimizing the memory usage of sequential multifrontal sparse factorizations.
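For readers outside the field, the flavor of communication-avoiding QR can be conveyed by the TSQR idea: factor a tall-skinny matrix block by block and combine only the small R factors, so the large blocks never have to move. Below is a minimal sketch in NumPy, not code from the talk; the function name and block count are arbitrary choices for illustration.

    import numpy as np

    def tsqr_r(A, nblocks=4):
        """Flat-tree TSQR sketch: factor each row block locally, keep only
        the small R factors, and factor their stack once more. In a parallel
        setting, only the small R factors would need to be communicated."""
        blocks = np.array_split(A, nblocks, axis=0)
        local_rs = [np.linalg.qr(blk, mode="r") for blk in blocks]   # local QRs
        return np.linalg.qr(np.vstack(local_rs), mode="r")           # combine step

    A = np.random.rand(4000, 50)                 # tall-skinny test matrix
    R = tsqr_r(A)
    R_direct = np.linalg.qr(A, mode="r")
    print(np.allclose(np.abs(R), np.abs(R_direct)))   # same R up to row signs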

Visualization of Static and Time-Varying High-Dimensional Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

Monday, Dec. 17, 2:00–3:00 pm, 50B-4205
Patrick Oesterling, University of Leipzig, Germany

Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. This analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape. The second part of this talk presents ongoing work on extending the static analysis of one time-step to support cluster analysis of time-varying point clouds, described by the temporal evolution of the density function's topology.
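As a rough intuition for the first phase described above, here is a toy sketch (not the speaker's join-tree implementation, and assuming SciPy is available): estimate a density at every point with a k-nearest-neighbor rule, then sweep the points from densest to sparsest, merging them into clusters over the k-nearest-neighbor graph with union-find. Each time two clusters meet, the lower peak's persistence records how prominent a hill it would be in a landscape profile. All names and parameters below are made up for illustration.

    import numpy as np
    from scipy.spatial import cKDTree

    def density_cluster_hierarchy(points, k=15):
        """Toy density-based overview: k-NN density estimate, then a sweep
        from densest to sparsest point with union-find. Returns the
        persistences of clusters that merged away and the surviving peaks."""
        n = len(points)
        dist, nbrs = cKDTree(points).query(points, k=k + 1)  # column 0 is the point itself
        density = 1.0 / (dist[:, 1:].mean(axis=1) + 1e-12)

        parent = list(range(n))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]                # path halving
                i = parent[i]
            return i

        peak_density = {}      # cluster root -> density of its highest point
        persistences = []      # peak height lost at each merge
        seen = np.zeros(n, dtype=bool)

        for i in np.argsort(-density):                       # densest first
            roots = {find(j) for j in nbrs[i, 1:] if seen[j]}
            seen[i] = True
            if not roots:                                    # i starts a new peak
                peak_density[i] = density[i]
                continue
            main = max(roots, key=peak_density.get)          # highest neighboring peak
            parent[i] = main
            for r in roots - {main}:                         # lower peaks merge into it
                persistences.append(peak_density[r] - density[i])
                parent[r] = main
                del peak_density[r]
        return sorted(persistences, reverse=True), peak_density

    # Example: two well-separated Gaussian blobs in 10 dimensions
    rng = np.random.default_rng(0)
    cloud = np.vstack([rng.normal(0, 1, (300, 10)), rng.normal(6, 1, (300, 10))])
    pers, peaks = density_cluster_hierarchy(cloud)
    print(len(peaks), "dominant peaks survive;", len(pers), "weaker peaks merged away")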

Physics in Screening Environments

Tuesday, Dec. 18, 11:00 am–12:00 pm, 50B-4205
Ondrej Certik, University of Nevada, Reno

In the current study, we investigated atoms in screening environments like plasmas.

It is common practice to extract physical data, such as temperature and electron densities, from plasma experiments. We present results that address inherent computational difficulties that arise when the screening approach is extended to include the interaction between the atomic electrons. We show that there may be an ambiguity in the interpretation of physical properties, such as temperature and charge density, from experimental data due to the opposing effects of electron-nucleus screening and electron-electron screening.

The focus of the work, however, is on resolving inherent computational challenges that appear already at the Hartree-Fock level. Furthermore, as examples of post-Hartree-Fock calculations, we show results from second-order Green's function theory and from second-order many-body perturbation theory.

In the second part of the talk, I will describe our work at LLNL, where the goal was to provide a faster and more robust Dirac solver for a self-consistent average-atom plasma physics code. We wrote a robust, general Schrödinger and Dirac solver for atomic structure calculations within density functional theory, based on a shooting method, as well as a spectral finite element solver for the Schrödinger and Dirac equations.
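For readers unfamiliar with the shooting method mentioned above, here is a minimal, self-contained sketch (not the LLNL code) that finds the hydrogen 1s level in atomic units: integrate the radial Schrödinger equation outward for a trial energy and bisect on the energy until the outward solution stops diverging. The grid limits and energy bracket below are arbitrary choices, and SciPy is assumed to be available.

    import numpy as np
    from scipy.integrate import solve_ivp

    Z, ell = 1.0, 0                  # hydrogen, s state (atomic units)
    r0, rmax = 1e-6, 30.0            # radial integration range in bohr

    def u_at_rmax(E):
        """Integrate u'' = [ell(ell+1)/r^2 - 2Z/r - 2E] u outward from r0 and
        return u(rmax); its sign flips as E crosses an eigenvalue."""
        def rhs(r, y):
            u, du = y
            return [du, (ell * (ell + 1) / r**2 - 2.0 * Z / r - 2.0 * E) * u]
        sol = solve_ivp(rhs, (r0, rmax), [r0, 1.0], rtol=1e-9, atol=1e-12)
        return sol.y[0, -1]

    lo, hi = -0.6, -0.4              # energy bracket around the expected 1s level
    f_lo = u_at_rmax(lo)
    for _ in range(40):              # bisection on the trial energy
        mid = 0.5 * (lo + hi)
        f_mid = u_at_rmax(mid)
        if f_mid * f_lo > 0:
            lo, f_lo = mid, f_mid
        else:
            hi = mid
    print("1s energy:", 0.5 * (lo + hi), "hartree (exact: -0.5)")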

The Materials Project: Combining Quantum Chemistry Calculations with Supercomputing Centers for New Materials Discovery

Tuesday, Dec. 18, 12:00–1:00 pm, 90-3122
Anubhav Jain, LBNL/CRD

New materials can potentially reduce the cost and improve the efficiency of solar photovoltaics, batteries, and catalysts, leading to broad societal impact. This talk describes a computational approach to materials design in which density functional theory (DFT) calculations are performed on very large computing resources. Because DFT calculations accurately predict many properties of new materials, this approach can screen tens of thousands of potential materials in short time frames.

We present some major software development efforts that generated over 8 million CPU-hours' worth of materials information in the span of a few months, identifying several new Li-ion battery cathode materials that were verified experimentally. This represents one of the largest materials data sets ever computed, and the results are compiled on a public web site (The Materials Project) with over 3,000 registered users who are designing new materials with the computed information.
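As background on how a DFT screen can rank battery cathode candidates (a standard relation in this line of work, not quoted from the talk): the average intercalation voltage of a host that takes up lithium between compositions Li_x1Host and Li_x2Host follows from total energies alone,

    \bar{V} \approx -\,\frac{E(\mathrm{Li}_{x_2}\mathrm{Host}) - E(\mathrm{Li}_{x_1}\mathrm{Host}) - (x_2 - x_1)\,E(\mathrm{Li\ metal})}{(x_2 - x_1)\,e},

where the E's are DFT total energies per formula unit and e is the elementary charge, so energies in electron-volts give the voltage directly. Screening thousands of candidate structures then reduces to computing a few total energies per candidate.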

Finally, we describe future efforts in which algorithms might "self-learn" which chemical spaces are the most promising for investigation based on the results of previous computations, with application to solar water splitting materials.

Fast Parallel Solution of Heterogeneous 3D Time-Harmonic Wave Equations

Wednesday, Dec. 19, 2:00–3:00 pm, 50B-4205
Jack Poulson, University of Texas at Austin

Several advancements related to the solution of 3D time-harmonic wave equations are presented, especially in the context of a parallel moving-PML sweeping preconditioner for wave propagation problems without large-scale resonances. In particular, heterogeneous 3D wave equations involving nearly a billion degrees of freedom can now be solved in the frequency-domain in just a few minutes using several thousand processors. High-performance parallel algorithms for multifrontal triangular solves with many right-hand sides will also be discussed, as well as a custom Kronecker-product compression scheme which builds upon the translation invariance of free-space Green’s functions.
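For context (standard background, not taken from the abstract), the prototypical heterogeneous time-harmonic wave equation in question is the Helmholtz equation for a medium with spatially varying wave speed c(x),

    -\Delta u(x) - \frac{\omega^2}{c(x)^2}\, u(x) = f(x), \qquad x \in \Omega \subset \mathbb{R}^3,

posed on a domain truncated by a perfectly matched layer (PML) that absorbs outgoing waves. Discretizing at a fixed number of points per wavelength makes the number of unknowns grow roughly like the cube of the frequency, which is why frequency-domain solves with nearly a billion unknowns are notable.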


Link of the Week: The Uses of Difficulty

An article in Intelligent Life magazine contends that the brain likes a challenge, and that putting a few obstacles in its way may well boost its creativity.

Our brains respond better to difficulty than we imagine. In schools, teachers and pupils alike often assume that if a concept has been easy to learn, then the lesson has been successful. But numerous studies have now found that when classroom material is made harder to absorb, pupils retain more of it over the long term, and understand it on a deeper level. Robert Bjork, of the University of California, coined the phrase “desirable difficulties” to describe the counter-intuitive notion that learning should be made harder by, for instance, spacing sessions further apart so that students have to make more effort to recall what they learnt last time. Read more.



About Computing Sciences at Berkeley Lab

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.