InTheLoop | 03.14.2011

LBNL Experts Edit, Contribute to Book on Performance Tuning

David Bailey and Sam Williams of CRD recently co-edited a book called Performance Tuning of Scientific Applications, which presents current research in performance analysis from some of the most notable experts in the field. (Robert Lucas, Director of Computational Sciences at the Information Sciences Institute, was also an editor for the book.) Performance analysis has grown into a full-fledged, sophisticated field of empirical science. Describing useful research in modern performance science and engineering, this book helps real-world users of parallel computer systems to better understand both the performance vagaries arising in scientific applications and the practical means for improving performance.

In addition to Bailey and Williams, other Berkeley Lab contributors to the book include Associate Lab Director for Computing Sciences Kathy Yelick, as well as Mark Adams, John Bell, Vincent Beckner, Jonathan Carter, Khaled Ibrahim, Juan Meza, John Shalf, Hongzhang Shan, Erich Strohmaier, Leonid Oliker, Lin-Wang Wang, Harvey Wasserman, and Zhengji Zhao. Read more.

Free NERSC Training for Chemistry and Materials Applications

Register now for the upcoming NERSC training “Chemistry and Material Sciences Applications.” The event will be held on April 5, 2011 (postponed from March 22) and hosted simultaneously at the Oakland Scientific Facility and via the Web. Registration and attendance are free of charge. Go here for details and here for registration.

ESnet Is Accepting Research Proposals for Networking Testbed

ESnet is accepting research proposals for the Advanced Networking Initiative Testbed until April 1, 2011. Researchers will have the opportunity to prototype, test, and validate cutting-edge networking concepts. Go here for further information.

Computing Sciences Has a Facebook Fan Page

Become a Facebook fan of Berkeley Lab Computing Sciences and get all of the latest news about NERSC, CRD, and ESnet. Start here.

This Week’s Computing Sciences Seminars

Molecular-Level Modeling of Structure and Proton Transport in Polymer Electrolyte Membrane and Aqueous Systems
Monday, March 14, 9:30–10:30 am, 50F-1647
Myvizhi Esai Selvan, University of Tennessee

Proton exchange membrane (PEM) fuel cells are an eco-friendly power source with great potential to reduce our oil dependence and carbon emissions. To make PEM fuel cells an economically viable option, it is essential to improve the ionic conduction of the membranes over a wide range of operating conditions. Molecular-level simulation of nanoscale structure and transport mechanisms within the membrane electrode assembly (MEA) of PEM fuel cells provides a fundamental understanding of the key structure/property relationships, which is necessary for the design and synthesis of membranes with superior characteristics.
To attain a basic understanding of the correlation between the morphology of the membrane and ionic conduction, two sequential tasks were performed. The first task was to generate atomistic descriptions of the morphology of many of the phases and interfaces present in the MEA. Classical molecular dynamics simulations were performed to examine the structural properties of the Nafion/vapor interface at water contents of 5%, 10%, 15%, and 20% by weight. A region of water depletion was found at the Nafion/vapor interface. A preferential orientation of hydronium ions was also observed at the interface, with their oxygens extended into the vapor phase.

The second task was to model proton transport through the aqueous nanochannels of the PEM. Proton transport in aqueous media occurs through a combination of conventional diffusion (vehicular diffusion) and a hopping mechanism (structural diffusion). A reactive molecular dynamics (RMD) algorithm, based on a mapping of quantum mechanically determined transition states onto nonreactive potentials, was employed to capture the overall proton transport. Proton transport in the PEM is governed by the acidity of the protogenic groups and by confinement in nanoscale aqueous domains. Therefore, systems in which the acidity and confinement can be independently varied were also examined in addition to the PEM itself: bulk water (at temperatures from 280 K to 320 K), aqueous hydrochloric acid solutions (at concentrations from 0.22 M to 0.83 M), and water confined in carbon nanotubes (at radii from 5.42 Å to 10.85 Å). Increasing the acidity or confinement was found to change the local energetic landscape, reducing structural diffusion, but had very little effect on vehicular diffusivity. The correlation between the two components of charge diffusion and their individual contributions to the overall charge diffusion was also explored to gain a basic understanding of the proton transport mechanisms.

These important studies will eventually provide guidance for the development of next-generation PEMs.
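The vehicular component of diffusion described above is conventionally estimated from the mean-squared displacement (MSD) of the protons via the Einstein relation, MSD(t) = 6Dt in three dimensions. As a generic illustration (not the speaker's RMD code), here is a minimal sketch using synthetic Brownian trajectories with a known diffusion coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0
n_steps, n_walkers = 2000, 500

# Brownian trajectories: Gaussian steps with per-axis variance sigma**2.
# For this walk the analytic result is D = sigma**2 / (2 * dt).
sigma = 0.1
steps = rng.normal(0.0, sigma, size=(n_steps, n_walkers, 3))
traj = np.cumsum(steps, axis=0)                # positions at each time step

# Ensemble-averaged mean-squared displacement from the common origin
msd = np.mean(np.sum(traj**2, axis=2), axis=1)
times = np.arange(1, n_steps + 1) * dt

# Einstein relation in 3D: MSD(t) = 6 D t, so D is the fitted slope / 6
slope = np.polyfit(times, msd, 1)[0]
D = slope / 6.0
print(f"estimated D = {D:.5f} (analytic: {sigma**2 / (2 * dt):.5f})")
```

In a production analysis one would average over multiple time origins and restrict the fit to the diffusive (long-time) regime, but the fit-the-slope idea is the same.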

LAPACK Seminar: Updated Sparse Cholesky Factors for Co-Rotational Elastodynamics
Wednesday, March 16, 11:10 am–12:00 pm, 380 Soda Hall, UC Berkeley
Florian Hecht, UC Berkeley

I will present work that was recently submitted to SIGGRAPH on “Updated Sparse Cholesky Factors for Co-Rotational Elastodynamics.” We have developed a way to incrementally update the Cholesky factor of a sparse matrix for the simulation of elastic materials. Instead of updating the complete factor, we can do partial updates at a small and controllable sacrifice in accuracy. With these updates we can simulate objects faster than with a traditional conjugate gradient (CG) iterative solver. Furthermore, with a direct Cholesky solver we make our solution times independent of material parameters and mesh quality. Our method also scales better for larger meshes than CG. I will explain in detail how efficient sparse direct solvers work and how we can use the same structures to do partial updates. I will also show the modifications to the base co-rotational method for elastodynamics that make this update scheme possible. Our factorization, solve, and update routines are parallelized, and I will discuss the difficulties of achieving high performance on many cores.
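As a rough illustration of why a direct Cholesky approach pays off in this setting, the expensive factorization A = LLᵀ is done once (or incrementally updated), after which each time step needs only two cheap triangular solves. The sketch below shows the factor-once/solve-many pattern on a small dense system; it does not show the talk's actual contribution (sparse, partial factor updates), and the matrix here is a hypothetical stand-in for a stiffness matrix:

```python
import numpy as np

def cholesky_solve(L, b):
    """Solve A x = b given A = L L^T, via forward then back substitution."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):                        # forward: solve L y = b
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    x = np.zeros(n)
    for i in reversed(range(n)):              # backward: solve L^T x = y
        x[i] = (y[i] - L[i+1:, i] @ x[i+1:]) / L[i, i]
    return x

# Symmetric positive definite system standing in for a stiffness matrix
rng = np.random.default_rng(1)
M = rng.normal(size=(50, 50))
A = M @ M.T + 50 * np.eye(50)

L = np.linalg.cholesky(A)                     # factor once (the costly step)

# ...then every time step reuses L for two O(n^2) triangular solves
f = rng.normal(size=50)
x = cholesky_solve(L, f)
```

Unlike CG, the cost of each solve here does not depend on the conditioning of A, which is the property the abstract highlights for stiff materials and poor-quality meshes.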

Google Innovation: Culture and Practices
Wednesday, March 16, 12:00–1:00 pm, Banatao Aud., Sutardja Dai Hall, UC Berkeley
Live broadcast at mms://media.citris.berkeley.edu/webcast
Dan Russell, Research Scientist and Search Anthropologist, Google

As a company, Google clearly relies on innovation to keep our business alive and growing. Translating that desire into a continual innovation practice is central to the outlook and world-view that Google has as a corporate culture. Innovation isn’t just for the futurists, but a part of what everyone in the company is expected to do on a day-to-day basis. People who work on internal processes, for example, are expected to be as innovative as engineers and product managers who drive externally visible products. Innovation isn’t something that the company can just leave to a few bright minds, but is deeply embedded in the culture of the company.

Beyond culture, though, there are a few pragmatic behaviors that help Google be innovative. A commonplace belief is that innovation originates with an identified market or user need. While we design for the user, we recognize that innovative ideas originate in many places—sometimes with user needs, but also occasionally from technology opportunities that suddenly become available. In these cases, the user need might not be clearly identified at the outset of research, but become evident only over time. Ultimately, of course, an innovation has to be user-relevant, but we understand that not everything starts that way.

One of the key drivers of Google innovation is our focus on data-driven analytics of our products. We instrument just about everything we can think of, log the data (anonymizing along the way to preserve privacy), then analyze it extensively. We recognize that innovation often proceeds in an evolutionary fashion, and that apparently large leaps in design and novel concepts are often hidden beneath a great deal of under-the-covers work that precedes the public announcement.

In user-interface design, for example, we don't just do A/B testing, but often A/B/C/D/E/F/... testing. And one of the deep lessons of such an extensive testing program is that our intuitions are often incorrect. Large changes in the design may very well degrade performance, while tiny, sometimes imperceptible changes can have profound consequences. In many of our products, the UI changes significantly over time, particularly as we learn from our experiments, but also as new technology and data become available.
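As a generic illustration of multi-variant testing (the numbers below are hypothetical, not Google data), comparing conversion rates with confidence intervals shows why intuition alone misleads: variant differences are often smaller than the statistical uncertainty at a given traffic level.

```python
import math

# Hypothetical click counts for several UI variants: (impressions, clicks)
variants = {
    "A": (10000, 512),
    "B": (10000, 548),
    "C": (10000, 495),
    "D": (10000, 561),
}

def rate_and_ci(impressions, clicks, z=1.96):
    """Conversion rate with a 95% normal-approximation confidence interval."""
    p = clicks / impressions
    half = z * math.sqrt(p * (1 - p) / impressions)
    return p, (p - half, p + half)

for name, (n, c) in variants.items():
    p, (lo, hi) = rate_and_ci(n, c)
    print(f"variant {name}: {p:.4f}  95% CI [{lo:.4f}, {hi:.4f}]")

best = max(variants, key=lambda k: variants[k][1] / variants[k][0])
```

With these numbers the intervals of the leading variants overlap, so declaring a winner requires either more traffic or a proper hypothesis test; real experimentation platforms use more careful statistics than this normal approximation.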

Innovation is thus often smoothly evolutionary, even though it may look like punctuated evolution from the outside. It is driven by continual rapid iteration and redesign, always guided by an objective function that includes goodness of fit to the environment and exaptation of opportunities as they arise.

Finally, we find that innovative products really are the product of many minds. A very small team might drive the initial design and creation of the concept, but having multiple people look at, evaluate, comment upon, and lend supporting insights is valuable. The trick is to allow these additional insights to be supportive, and not weigh the original ideas down with extraneous freight. Keeping an innovation clear, clean, and useful to the consumer is an important practice to avoid losing the key insight and value in the innovation.

CITRIS Distinguished Speaker Series: The Data and Compute-Driven Transformation of Modern Science
Thursday, March 17, 12:00–1:00 pm, Banatao Aud., Sutardja Dai Hall, UC Berkeley
Live broadcast at mms://media.citris.berkeley.edu/webcast
Edward Seidel, Assistant Director for Mathematical and Physical Sciences, National Science Foundation

Modern science is undergoing a profound transformation as it aims to tackle the complex problems of the 21st Century. It is becoming highly collaborative; problems as diverse as climate change, renewable energy, or the origin of gamma-ray bursts require understanding processes that no single group or community has the skills to address. At the same time, after centuries of little change, compute, data, and network environments have grown by 12 orders of magnitude in the last few decades. Cyberinfrastructure—the comprehensive set of deployable hardware, software, and algorithmic tools and environments supporting research, education, and increasing collaboration across disciplines—is transforming all research disciplines and society itself. Using examples ranging from astrophysics to emergency forecasting, I will describe new trends in science and the need for, potential of, and transformative impact of cyberinfrastructure. I will also discuss current and planned future efforts at the National Science Foundation to address them.

Current Status of Coherent Large-Scale InP Photonic Integrated Circuits
Friday, March 18, 11:00 am–12:30 pm, 521 Cory Hall (Hogan Room), UC Berkeley
Fred A. Kish, Infinera Corporation

The current state of the art for large-scale InP photonic integrated circuits (PICs) is reviewed with a focus on the devices and technologies that are driving the commercial scaling of these highly integrated devices. Specifically, high-capacity dense wavelength division multiplexed (DWDM) transmitter and receiver PICs are reviewed with a focus on next-generation devices: >500 Gb/s coherent multi-channel transmitter and receiver InP PICs. These large-scale PICs integrate hundreds of devices onto a single monolithic InP chip and enable significant reductions in cost, packaging complexity, size, fiber coupling, and power consumption, yielding benefits at both the component and system levels.

About Computing Sciences at Berkeley Lab

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.