
InTheLoop | 06.11.2012

June 11, 2012

Using NERSC Systems, Physicists Close In on a Rare-Particle Decay Process

With help from supercomputers at NERSC, the Enriched Xenon Observatory experiment (EXO-200) has placed the most stringent constraints yet on the nature of a process called neutrinoless double beta decay. In doing so, the physicists have narrowed down the range of possible masses for the neutrino—a tiny uncharged particle that rarely interacts with anything, passing right through people and planets at nearly the speed of light. Read more.


Computer Model Pinpoints Prime Materials for Efficient Carbon Capture

With the collaboration of Richard Martin, Chris Rycroft, and Maciej Haranczyk of Berkeley Lab’s Computational Research Division, who used the Dirac GPU cluster at NERSC, Berend Smit of the Materials Sciences Division led the multi-institutional development of a computer model that can screen solid materials such as zeolites and metal-organic frameworks (MOFs) for their ability to cost-effectively capture carbon emissions from fossil fuel-burning power plants. Current carbon-capture technologies would consume about one-third of the energy a plant generates (the so-called “parasitic energy”), which would substantially drive up the price of electricity. The new model shows that the parasitic energy cost of carbon capture could be reduced by 30 percent with the use of more efficient materials. Read more.


ESnet Celebrates World IPv6 Launch

On Wednesday, June 6, a new Internet began. The change was subtle, but a new address system is making the web faster and enabling things that were not possible before. The new system, called IPv6, provides roughly 340 trillion trillion trillion addresses. The next time you visit Google, Amazon, Facebook, and other major sites, they will be using it. Read more.
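
For a sense of scale, that figure follows directly from the 128-bit length of an IPv6 address. A quick back-of-the-envelope check in Python (an illustrative aside, not part of the ESnet announcement):

    # IPv6 addresses are 128 bits, so the address space is 2**128.
    total = 2 ** 128
    print(total)              # 340282366920938463463374607431768211456
    print(f"{total:.3e}")     # ~3.403e+38, i.e. about 340 trillion trillion trillion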

There was an incorrect link in last week’s InTheLoop for KGO-TV’s interview with Eli Dart on IPv6; go here to see the interview.


This Week’s Computing Sciences Seminars

Visualization and Computation for Transportation Modeling
Tuesday, June 12, 10:00 am–12:00 pm, 50B-4205
Joel VanderWerf, Gabriel Gomes, Anthony Patire, Bill Sappington, and Saneesh Apte, Institute of Transportation Studies, University of California, Berkeley

California PATH is beginning a five-year project to revolutionize transportation analysis, planning, and management. New research on macroscopic modeling, estimation, and simulation promises a quantum jump in prediction accuracy and computational efficiency. Our goal is to build tools around these models and put them into the hands of system operators as well as researchers, so that they can compare alternative control strategies in real time, evaluate offering travelers incentives to shift their mode, route, or time, correlate traffic flow with other system perspectives such as emissions and safety, and plan longer-term system changes.

The resources of this project (called CC-VIP: Connected Corridors: Vehicles, Infrastructure, People) are substantial, bringing together two mature projects under the guidance of four tenured UCB professors. The team also includes about 15 model developers (GSRs, postdocs, and research staff), a core software team of about 5, including a full-time Oracle DBA, and a small but growing visualization group.

A key difficulty facing us is the size and complexity of data all along the pipeline from noisy sensor signals, to ensemble Kalman filter estimation, to PDE-based simulation, and finally to result sets. Data sets are multidimensional: not just space and time, but also data source, scenario alternative, model alternative, and best/worst case. Data sets may include confidence levels or error bounds. Scope ranges from system-wide origin-destination patterns to link-level flow, speed, and density to transit vehicle trajectories in congested arterial traffic. Time scales vary from seconds to days to years.
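
For readers unfamiliar with the estimation stage mentioned above, the sketch below shows one stochastic ensemble Kalman filter analysis step in Python/NumPy. It is a generic textbook formulation with made-up traffic numbers, not CC-VIP code.

    import numpy as np

    def enkf_update(X, y, H, R, rng=None):
        """One stochastic EnKF analysis step (perturbed observations).
        X: (n_state, n_ens) forecast ensemble    y: (n_obs,) observations
        H: (n_obs, n_state) observation operator R: (n_obs, n_obs) obs-error covariance
        """
        if rng is None:
            rng = np.random.default_rng(0)
        n_ens = X.shape[1]
        A = X - X.mean(axis=1, keepdims=True)              # ensemble anomalies
        Pf = A @ A.T / (n_ens - 1)                         # sample forecast covariance
        K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)     # Kalman gain
        # perturb the observation once per ensemble member
        Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
        return X + K @ (Y - H @ X)                         # analysis ensemble

    # Toy usage: densities on 5 road links, 20 ensemble members, sensors on links 1 and 3.
    rng = np.random.default_rng(1)
    X = rng.normal(50.0, 10.0, size=(5, 20))               # forecast densities (veh/km)
    H = np.zeros((2, 5)); H[0, 1] = H[1, 3] = 1.0
    y = np.array([62.0, 41.0])                             # noisy sensor readings
    R = 4.0 * np.eye(2)
    Xa = enkf_update(X, y, H, R)                           # updated ensemble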

For visualizing this complexity in our models and data, we’re not looking for a better time-series plot or heat map generator or Google map overlay tool, but something with more functionality: visualization that allows navigating around in many dimensions, highlighting patterns or structures, correlating between data sets, and understanding uncertainty.

As representatives of the modeling and software teams of the CC-VIP project, we would like to explore collaboration with LBNL and SDAV on both visualization and HPC, including sharing of software tools and platforms, and staffing.

About us:
http://traffic.berkeley.edu
http://gateway.path.berkeley.edu/topl
http://www.path.berkeley.edu

Related projects:
http://amplab.cs.berkeley.edu
https://pems.eecs.berkeley.edu

MATLAB Seminar: Data Analysis and Visualization with MATLAB
Tuesday, June 12, 10:00 am–12:00 pm, 50 Auditorium
Saket Kharsikar, MATLAB Application Engineer

During this technical presentation, we will introduce specific examples to demonstrate how to acquire, analyze and visualize data through mathematical, statistical and engineering functions that support common engineering operations. We will also provide an overview of the MATLAB technical computing environment, including desktop tools for editing and debugging code, publishing your work, surface fitting, and creating graphical user interfaces (GUIs).

Highlights include:

  • Importing data and images into MATLAB
  • Performing statistical analysis and curve fitting
  • Automating analysis via MATLAB code generation
  • Developing algorithms and applications
  • Building GUIs and generating reports
  • Creating standalone executables
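
The session itself is in MATLAB; purely as a generic illustration of the curve-fitting workflow listed above, here is a short sketch in Python/SciPy with synthetic data (the model and numbers are invented for the example):

    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic measurements: an exponential decay plus noise.
    x = np.linspace(0.0, 10.0, 50)
    y = 3.0 * np.exp(-0.4 * x) + np.random.default_rng(0).normal(0.0, 0.1, x.size)

    def model(x, a, b):
        return a * np.exp(-b * x)

    # Fit the model and report parameters with one-sigma uncertainties.
    params, cov = curve_fit(model, x, y, p0=(1.0, 0.1))
    errs = np.sqrt(np.diag(cov))
    print(f"a = {params[0]:.2f} +/- {errs[0]:.2f}, b = {params[1]:.2f} +/- {errs[1]:.2f}")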

MATLAB Seminar: Parallel and GPU Computing with MATLAB
Tuesday, June 12, 2:00–4:00 pm, 50 Auditorium
Saket Kharsikar, MATLAB Application Engineer

In this session you will learn how to solve computationally intensive and data-intensive problems using multicore processors, GPUs, and computer clusters. We will introduce you to high-level programming constructs that allow you to parallelize MATLAB applications and run them on multiple processors. We will show you how to overcome the memory limits of your desktop computer by distributing your data on a large-scale computing resource, such as a cluster. We will also demonstrate how to take advantage of GPUs to speed up computations without low-level programming.

Highlights include:

  • Toolboxes with built-in support for parallel computing
  • Creating parallel applications to speed up independent tasks
  • Scaling up to computer clusters, grid environments or clouds
  • Employing GPUs to speed up your computations
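
The session covers MATLAB’s parallel tooling; purely as a generic illustration of the “speed up independent tasks” idea, here is a minimal Python sketch using the standard multiprocessing module (the task is a made-up placeholder):

    from multiprocessing import Pool

    def simulate(seed):
        """Placeholder for an independent, embarrassingly parallel task."""
        total = 0
        for i in range(1, 100_000):
            total += (seed * i) % 7
        return total

    if __name__ == "__main__":
        # Run 16 independent tasks across 4 worker processes.
        with Pool(processes=4) as pool:
            results = pool.map(simulate, range(16))
        print(sum(results))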

NERSC Brown Bag Seminar: Lorenz: Bringing HPC to the Web
Friday, June 15, 12:00–1:00 pm, OSF 943-238
Joel Martinez, Lawrence Livermore National Laboratory

Lorenz, a tool from LLNL, is a web-based portal designed to support user interaction within a high performance computing environment. Lorenz builds on existing web paradigms, using advanced JavaScript techniques and a RESTful web service API to make HPC easier and more accessible. Additionally, the Lorenz team has implemented the Simulation Input Markup Language (SIML) and its interpreter, which allow robust simulation interfaces to be developed using only HTML.
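
Lorenz’s actual endpoints are not documented here; the fragment below only illustrates the general RESTful-portal pattern the talk describes, using the Python requests library against a hypothetical URL with invented JSON fields:

    import requests  # third-party HTTP client

    # Hypothetical endpoint and response fields, not the actual Lorenz API.
    resp = requests.get("https://hpc.example.gov/api/v1/queues/batch", timeout=10)
    resp.raise_for_status()
    queue = resp.json()
    print(queue.get("running"), "jobs running;", queue.get("pending"), "pending")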

If you have web-oriented summer interns, please invite them as well.


Link of the Week: Science Is Not About Certainty

Carlo Rovelli, a theoretical physicist and one of the founders of loop quantum gravity theory, is also a formidable philosopher of science. In a video conversation (with transcript) with the online magazine Edge, “Science Is Not About Certainty: A Philosophy of Physics,” Rovelli attributes theoretical physics’ 30-year slump (no big success since the Standard Model) to a philosophical misunderstanding of science, which results in unproductive methodologies.

Rovelli sees the two major research approaches to quantum gravity—strings and loops—as the embodiments of two different philosophies of science. String theorists embrace the notion that scientific progress happens when new theories overturn old theories. But Rovelli contends that isn’t what really happens: major progress happens when we take existing theories seriously and allow their contradictions and loose ends to challenge and change how we think about the world. Scientific progress is not about data or theories, he says, it’s about changing conceptual frameworks. Loop quantum gravity, for example, is very conservative, because it’s based on what we know, but it’s also totally radical, because it suggests that time has no fundamental reality—it’s an illusion.

Here are a few excerpts from the conversation:

Science is not about certainty. Science is about finding the most reliable way of thinking, at the present level of knowledge. Science is extremely reliable; it’s not certain. In fact, not only it’s not certain, but it’s the lack of certainty that grounds it. Scientific ideas are credible not because they are sure, but because they are the ones that have survived all the possible past critiques, and they are the most credible because they were put on the table for everybody’s criticism.

The very expression “scientifically proven” is a contradiction in terms. There is nothing that is scientifically proven. The core of science is the deep awareness that we have wrong ideas, we have prejudices….

[S]cience is not about the data. The empirical content of scientific theory is not what is relevant. The data serves to suggest the theory, to confirm the theory, to disconfirm the theory, to prove the theory wrong. But these are the tools that we use. What interests us is the content of the theory. What interests us is what the theory says about the world….

Science is a continuous challenge of common sense, and the core of science is not certainty, it’s continuous uncertainty….

String theory’s a big guesswork. I think physics has never been a guesswork; it has been a way of unlearning how to think about something, and learning about how to think a little bit different by reading the novelty into the details of what we already know.



About Computing Sciences at Berkeley Lab

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.