InTheLoop | 09.27.2010
Michael Wehner Joins Congressional Briefing on Extreme Weather in a Warming World
Michael Wehner, a member of the Scientific Computing Group in CRD who researches extreme weather conditions resulting from global climate change, was one of four panelists at a Congressional briefing on “Extreme Weather in a Warming World.” The Select Committee on Energy Independence and Global Warming, chaired by Rep. Edward J. Markey (D-Mass.), held the briefing on Thursday, Sept. 23, in the Rayburn House Office Building in Washington, D.C. The most prominent speaker was Husain Haqqani, Pakistan’s Ambassador to the United States, who discussed the historic floods that have displaced millions of his countrymen.
Wehner spoke of his research and said that “Changes in the magnitude and frequency of extreme weather associated with changes in the average climate are likely the most serious consequence of human-induced global warming.” Among the effects we are likely to see are more frequent heat waves, more intense precipitation, more severe droughts, and more frequent and severe tropical cyclones.
Chorin and Sethian Awarded Prestigious Math Prizes
Berkeley Lab’s Alexandre Chorin and James Sethian won prestigious prizes from the International Council for Industrial and Applied Mathematics (ICIAM) for groundbreaking work in applied math, with impacts ranging from fluid mechanics and aerodynamics to medical imaging and semiconductor manufacturing. Chorin won the 2011 Lagrange Prize and Sethian won the 2011 Pioneer Prize. The awards, announced by the ICIAM on September 20, bring to Berkeley Lab two of the five math prizes the organization awards every four years. The ICIAM is composed of many of the national and international associations of professional mathematicians concerned with applications. Read more.
Katie Antypas Appointed NERSC User Services Group Lead
Katie Antypas has been appointed NERSC User Services Group (USG) Lead. In making the announcement, NERSC Division Director Kathy Yelick said, “As a USG member since 2006, [Katie] has been a passionate advocate for the NERSC users, which she has demonstrated in planning activities for the 2010 NERSC Policy Board meeting and the 2010 Operating Assessment review. She has also shown excellent leadership as a member of the NERSC-6 procurement team, co-lead of the NERSC-6 implementation team, and co-lead on a number of the Franklin stabilization ‘tiger’ teams. Please join me in welcoming her to this important position.”
“The Supernova’s Secrets Cracked at Last?” in Time
“The Supernova’s Secrets Cracked at Last?” by Michael D. Lemonick, posted to Time magazine online last Friday (September 24), describes the breakthrough 3D simulations of supernova explosions created by astrophysicists Jason Nordhaus and Adam Burrows of Princeton University and Berkeley Lab mathematicians Ann Almgren and John Bell. These simulations show the turbulence in the collapsing star’s shock wave in greater detail than ever before, pointing toward a more complete explanation of supernovas’ explosive power. A recent Princeton/Berkeley Lab article also described the research.
David Bailey Comments on Pi Calculations in New Scientist
A Yahoo researcher has made a record-breaking calculation of the digits of pi using his company’s computers, calculating the 2 quadrillionth (2 × 10¹⁵) binary digit of pi. The feat comes hot on the heels of a breakthrough Rubik’s cube result that used Google’s computers. Together, the results highlight the growing power of internet search giants to make mathematical breakthroughs. CRD Chief Technologist David Bailey, who in 1996 co-discovered the first formula allowing one to skip ahead to compute distant digits of pi, offers his perspective in the New Scientist article “New pi record exploits Yahoo’s computers” by David Shiga.
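The digit-skipping trick Bailey refers to is the Bailey–Borwein–Plouffe (BBP) formula, which yields hexadecimal (and hence binary) digits of pi at an arbitrary position without computing any of the preceding ones. Below is a minimal sketch in double-precision Python; the function name and the tail-sum cutoff are illustrative choices of ours, and a production run at quadrillion-digit positions would need higher-precision arithmetic.

```python
def pi_hex_digit(n):
    """Return the hex digit of pi at position n (0-indexed after the point),
    via the BBP formula: pi = sum_k 16^-k (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)).
    Multiplying by 16^n and keeping only fractional parts lets us skip ahead."""

    def series(j, n):
        # Fractional part of sum_k 16^(n-k) / (8k + j).
        s = 0.0
        for k in range(n + 1):
            # Three-argument pow does the modular exponentiation cheaply,
            # which is what makes digit-skipping feasible.
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        # Tail terms (k > n) shrink geometrically; truncate once negligible.
        t, k = 0.0, n + 1
        while True:
            term = 16.0 ** (n - k) / (8 * k + j)
            if term < 1e-17:
                break
            t += term
            k += 1
        return (s + t) % 1.0

    x = (4 * series(1, n) - 2 * series(4, n)
         - series(5, n) - series(6, n)) % 1.0
    return int(16 * x)
```

Since pi = 3.243F6A88... in hexadecimal, `pi_hex_digit(0)` gives 2, `pi_hex_digit(3)` gives 15 (hex F), and so on.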
Jason Hick to Discuss Real-Time Monitoring at HPSS User Forum
NERSC Storage Systems Group Lead Jason Hick will give presentations on “Real-Time Monitoring,” “LBNL Site Presentation,” and “HPSS Requirements and Wish List Discussion” at the HPSS User Forum 2010, which is being held this week (September 27–29) at DKRZ, the German Climate Computing Centre, in Hamburg.
This Week’s Computing Sciences Seminars
Image Analysis of Hyper-/Multi-Spectral, Wood-Core and Biomedical Images
Monday, Sept. 27, 9:00–10:00 am, 50B-2222
Janak Ramachandran, Altria Client Services
Imaging technologies have been a vital tool for automating laborious manual processes in several industries. This talk will discuss image processing and machine-learning techniques used in four different applications.
The use of information in multiple spectral bands, as opposed to a single band, has led to higher object detection rates and improved discrimination among multiple classes of objects. The goal of the hyper-spectral imaging project was to develop a system that could automatically classify different grades of tobacco leaves. The pattern recognition algorithm developed for this classification consisted of a learning phase and a testing phase. The fingerprints (spectral signatures) for each tobacco grade depended on the chemical compounds inside the tobacco leaves, and the small chemical variations between different grades were reflected in their respective fingerprints. Thus, a database of signatures was created and stored during the learning phase and used to classify the sample of interest in the testing phase.
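The learning/testing scheme described above can be sketched as a nearest-signature classifier: the learning phase stores one reference spectrum per grade, and the testing phase assigns a sample to the closest stored signature. The grade labels, spectra, and distance metric below are illustrative placeholders of ours; the talk does not specify the actual matcher used.

```python
def classify(sample, signature_db):
    """Testing phase: return the grade whose stored spectral signature
    lies closest (Euclidean distance) to the sample's spectrum."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(signature_db, key=lambda grade: distance(sample, signature_db[grade]))

# Learning phase: one averaged spectrum per tobacco grade (made-up values).
signature_db = {
    "grade_A": [0.82, 0.40, 0.11],
    "grade_B": [0.35, 0.77, 0.29],
}
```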
For the geospatial imaging project, the main goal was to provide tools to assist the nuclear image analyst in the tedious process of searching through large geospatial image libraries for potential nuclear proliferation activities. As part of this process, an automated Graphical User Interface (GUI) based tool was developed to label and classify various semantic structures using primitive labels (buildings, roads, water bodies, etc.) and thereby generate ground truth. A dynamic and optimal tessellation technique called quad-tree partitioning was adapted, implemented, and evaluated.
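Quad-tree partitioning recursively splits an image into four quadrants until each region is homogeneous enough to carry a single label. A minimal sketch follows; the homogeneity test here is pixel variance, which is an illustrative choice of ours, since the abstract does not specify the criterion used.

```python
def quadtree_partition(img, x, y, size, threshold, leaves):
    """Recursively split the square region at (x, y) of side `size`
    until its pixel variance falls below `threshold`, collecting
    leaf regions as (x, y, size) tuples."""
    vals = [img[r][c] for r in range(y, y + size) for c in range(x, x + size)]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    if var <= threshold or size == 1:
        leaves.append((x, y, size))  # homogeneous enough: stop splitting
        return
    h = size // 2
    for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):  # the four quadrants
        quadtree_partition(img, x + dx, y + dy, h, threshold, leaves)
```

On an image with four internally uniform quadrants, a single split suffices, so the tessellation adapts its resolution to the scene.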
The wood-core imaging application was developed for the bio-ethanol project. The project focused on the development of image processing methods for the analysis of wood-core images. The complex tree ring structures of wood-core images necessitated the development of advanced, non-stationary image processing methods that could detect annular growth rings. To address these issues, this project explored the use of Savitzky-Golay (S-G) filtering in the spatial domain, improved frequency-modulation methods and image adaptive filterbanks.
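Savitzky-Golay filtering smooths a signal by least-squares fitting a low-order polynomial over a sliding window, which reduces to a fixed convolution. The sketch below uses the classic 5-point quadratic coefficients (-3, 12, 17, 12, -3)/35; leaving the endpoints unfiltered is a simplification of ours, not part of the method as used in the project.

```python
def savgol_smooth(signal):
    """Savitzky-Golay smoothing: quadratic least-squares fit over a
    5-point window, expressed as the equivalent convolution."""
    coeffs = (-3, 12, 17, 12, -3)
    out = list(signal)  # endpoints left unfiltered in this sketch
    for i in range(2, len(signal) - 2):
        out[i] = sum(c * signal[i + k - 2] for k, c in enumerate(coeffs)) / 35.0
    return out
```

A useful property for ring-detection work: unlike a plain moving average, the filter reproduces any quadratic exactly, so smooth trends in the data pass through undistorted while high-frequency noise is suppressed.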
The biomedical imaging project aimed to develop a Computer Aided Radiograph Reader System (Computer Aided Detection) that could automatically classify the International Labor Organization’s standard digitized images and reduce inter- and intra-reader variability. Its goal was to determine, for the entire lung, the level of opacity profusion of an interstitial lung disease called pneumoconiosis on a 12-point rating scheme. Active Shape Modeling, a statistical modeling technique, was employed for inter-rib segmentation. The resulting automatic classifier system could reduce the time and money radiologists spend on manual classification.
GPU-Accelerated Molecular Dynamics Models for Studying Soft Matter Systems
Monday, Sept. 27, 2:30–3:30 pm, 50F-1647
Carolyn Phillips, University of Michigan
Many properties of materials are determined by micro- to nanoscale features, and materials at the scale of tens to hundreds of atoms exhibit radically different physical properties than their bulk counterparts. Already, a new class of nano-engineered materials is being developed whose properties are carefully controlled at the molecular level. In the future, nano-engineered materials will not be constructed top-down, component by component, but rather self-assembled from the bottom up through a careful choice of specifically designed nanoscale components. Even coarse-grained models of nanoparticle self-assembly can be computationally expensive: to observe the critical features, many systems require thousands of particles and tens of millions of time steps, and have large parameter spaces to explore.
HOOMD-blue (Highly Optimized Object-oriented Many-particle Dynamics, Blue Edition), developed in my research group, performs general-purpose particle dynamics simulations on a single GPU-enabled workstation, achieving the performance of dozens of processor cores. In my research in soft matter self-assembly, I use GPU-accelerated molecular dynamics to rapidly generate and explore phase diagrams of polymer-tethered nanoparticles. I am investigating new computationally efficient models for capturing the self-assembly behavior of the heterogeneous and anisotropic nanoscale particles found in soft matter systems. These particles will be the building blocks of novel materials designed to self-assemble spontaneously.
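The core loop of a molecular dynamics code like the one described advances particles with a symplectic integrator, typically velocity Verlet, under a pair potential such as Lennard-Jones. Below is a toy two-particle sketch of ours in reduced units (epsilon = sigma = mass = 1); it illustrates the half-kick/drift/half-kick structure of a timestep, not actual HOOMD-blue code.

```python
def lj_force(r):
    """Lennard-Jones pair force magnitude at separation r
    (epsilon = sigma = 1); positive values push the pair apart."""
    return 24.0 * (2.0 / r**13 - 1.0 / r**7)

def verlet_step(x, v, dt):
    """One velocity Verlet step for two particles on a line:
    half-kick, drift, recompute forces, half-kick."""
    f = lj_force(x[1] - x[0])
    v = [v[0] - 0.5 * dt * f, v[1] + 0.5 * dt * f]
    x = [x[0] + dt * v[0], x[1] + dt * v[1]]
    f = lj_force(x[1] - x[0])
    v = [v[0] - 0.5 * dt * f, v[1] + 0.5 * dt * f]
    return x, v
```

The scheme's appeal for long runs is energy behavior: total energy oscillates within a small bound instead of drifting, even over the tens of millions of steps the abstract mentions. On a GPU, the force evaluation is the part parallelized across many particle pairs.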
Quantum Information Processing with Superconducting Circuits
Monday, Sept. 27, 4:30 pm, 1 LeConte Hall, UC Berkeley
Irfan Siddiqi, UC Berkeley
Over the past 15 years, there has been growing excitement about storing, processing, and transmitting information using physical systems that exhibit quantum mechanical phenomena. Using superconducting electronics, it is possible to realize coherent nonlinear oscillators with vanishing internal dissipation, the basic building blocks of “artificial” atoms that can serve as quantum bits and of quantum-noise-limited amplifiers for measurement. An introduction to the field will be presented, along with recent results on single-shot quantum measurement and novel architectures, including weak-link Josephson junctions and hybrid solid-state circuits.
Par Lab Seminar: How’s the Parallel Computing Revolution Going? Towards Parallel Scalable Virtual Machine Services
Tuesday, Sept. 28, 10:00–11:00 am, 438 Soda Hall, UC Berkeley
Kathryn McKinley, University of Texas at Austin
To answer this question, we first survey trends in the scalability and power efficiency of modern applications executing on a cross-section of current and past processors. These results show that many managed applications are not yet scalable and consume disproportionate amounts of power. Achieving performance scaling for parallel applications written in managed languages must start with scalable virtual machine (VM) services. Because the VM schedules, monitors, profiles, compiles, optimizes, garbage collects, and executes along with the application, it must itself be scalable, and it has a unique opportunity to enhance application scalability. This talk uses results from my research group on concurrent dynamic analysis to illustrate the potential and challenges of obtaining scalability on modern chip multiprocessor hardware.
We conclude with an overview of challenges for scalable virtual machines and suggested future directions.
Invasion of the Digital World in Art, Entertainment, Social Media and More
Wednesday, Sept. 29, 12:00 pm, Banatao Aud., Sutardja Dai Hall, UC Berkeley
Jean Paul Jacob, Special Advisor to CITRIS
Live broadcast at mms://media.citris.berkeley.edu/webcast
In his talk, Dr. Jacob will take you on a guided tour of what your life could be in the short- and long-term future. We are increasingly living in a physical world augmented by the arrival of many digital worlds. When you watch a movie, you don’t know when the real actors are shown on the screen versus wire-framed computer-generated clones. When Indiana Jones is surrounded by snakes and serpents, the only risk for him is ... a computer crash. Robots performing delicate surgeries, intelligent mirrors, user-designed products, flying cars, 3D virtual worlds like Second Life, ink on a dead tree — also known as books and newspapers — replaced by e-ink, etc., are all part of the continuously augmented physical world. What else is “out there” ready to invade our lives? Jean Paul will show examples of this invasion of digital/virtual worlds in the arts and entertainment, sensors, social networks, healthcare and medicine, 3D web and Internet, and more. Please join him for this exciting tour of your future!
Link of the Week: Can Grazing Animals Save the Planet?
Is bare ground, rather than burning fossil fuels, the main cause of carbon buildup in the atmosphere? According to advocates of the agricultural model known as “holistic management,” global soil depletion and excess atmospheric CO2 are flip sides of the same problem, and both can be resolved by the same solution: livestock — not cattle crammed into feedlots, but rather “planned grazing,” with herds of well-managed grazing animals nibbling on native grasses and roaming from place to place to elude predators and seek fresh pasture.
Ian Mitchell-Innes, a South African rancher and trainer in holistic management, says, “If we improve 50 percent of the world’s agricultural land we could sequester enough carbon in the soil to bring atmospheric CO2 back to pre-industrial levels in five years.”
According to Abe Collins of New Soil Security, Inc., a 1 percent increase in soil carbon on 5 billion acres of agricultural land would not only relieve our atmosphere of some 200 billion tons of CO2 — the equivalent of 100 parts per million — but also enhance food production. And because covered, carbon-rich soil infiltrates and holds significantly more water than its dried-out counterpart, it aids stream and river flow and protects against flooding and drought.
Read more in “Roving Herds of Grazing Climate Helpers” by Judith D. Schwartz in Miller-McCune Magazine.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 7,000-plus scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are Department of Energy Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.