
InTheLoop | 05.02.2011


Record-Setting Antimatter Particle Detected with NERSC Help

Eighteen examples of the heaviest antiparticle ever found, the nucleus of antihelium-4, have been made in the STAR experiment at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory. The finding wasn’t unexpected, but it is a milestone for scientists exploring a fundamental puzzle of physics: Why is there any matter at all? The next possible heavyweight antimatter particle will be thousands of times harder to detect, so this record is likely to stand for years to come. Read more.


NERSC Helps Solve LED Efficiency Puzzle

Despite being cool, ultra-efficient and long-lasting, the light-emitting diode (LED) remains an impractical alternative for general lighting due to a problem called “efficiency droop.” New findings from simulations carried out at the National Energy Research Scientific Computing Center (NERSC) have unearthed droop’s elusive cause, researchers say, paving the way for wider LED use. Read more.


New Recruiting and Hiring Process and Careers Website

Effective today, Human Resources has implemented a streamlined, simplified recruiting and hiring process; transitioned to Taleo, a best-in-class, cloud-based recruiting and hiring system; and launched a new Careers Website. The goal of these improvements is to support hiring managers in more effectively sourcing and hiring exceptional talent. Please contact Jeff Todd (JLTodd@lbl.gov) for more information and a Taleo demonstration.


Memorial Celebration of Stu Loken’s Life on May 22

As many of you know, Stu Loken passed away on February 19, 2011. Stu was a Senior Scientist in Berkeley Lab's Physics Division, having worked at the Lab for 37 years. More than half that time was devoted to service in Lab management.

As Director of the Information and Computing Sciences Division (forerunner of the multidivisional Computing Sciences organization) from 1988 to 2000, Stu created both the Technical and Electronic Information Department, forerunner of today’s Creative Services Office, and the Computing Infrastructure Support Department, forerunner of the Information Technology Division.

Perhaps most significantly, he is credited with having laid the groundwork for bringing NERSC to Berkeley Lab, and for having helped create the Department of Energy’s ESnet and bringing its headquarters here. (Read a brief career summary here.)

Geanie Loken and her family would like to invite you to attend a “Celebration of Stu Loken’s Life” from 2:00 to 5:00 pm on Sunday, May 22, 2011 at the Claremont Resort Hotel. Please RSVP here.


i4Science Lecture Series Sponsors Eight Talks on May 3, 5 at CITRIS

The Berkeley Computational Science and Engineering program, developed jointly by Berkeley Lab Computing Sciences, UC Berkeley, and the University’s Center for Information Technology Research in the Interest of Society (CITRIS), is sponsoring a series of presentations on Tuesday, May 3, and Thursday, May 5. The talks are open to the public and will be held in the Banatao Auditorium in Sutardja Dai Hall on the Cal campus.

CSE is a rapidly growing multidisciplinary field that encompasses real-world complex applications (scientific, engineering, social, economic, policy), computational mathematics, and computer science and engineering. High performance computing, large-scale simulations, and scientific applications all play a central role in CSE. i4Science will focus mainly on a smaller subset of CSE applications that, within three to five years, could scale from thousands to millions of processors and from tera- to exascale computing using emerging computing technologies.

Here is the i4Science lecture schedule:

Tuesday, May 3
Noon–1 pm: Eng Lim Goh, senior vice president and chief technology officer at SGI, will discuss “Computer Modeling of Natural and Synthetic Systems.”

Thursday, May 5
Noon–1:15 pm: New Engineering Trends and Data Challenges

  • 12:15–12:45 pm: Prof. Jose Carmena, co-director of the Center for Neural Engineering and Prostheses at UC Berkeley and UCSF and professor in the Brain-Machine Interface Systems Laboratory at UCB, will discuss “New Trends in Neural Engineering and Prostheses.”
  • 12:45–1:15 pm: Juan Meza, Acting Director of LBNL’s Computational Research Division (CRD), will discuss “Data Challenges in Energy and Environmental Applications.”

1:15–2:00 pm: Jim Spohrer, director of IBM University Programs Worldwide, will discuss “The Future of ICT to Build a Smarter Planet.”

2:00–3:00 pm: Computation and Data-Driven Modeling

  • 2:00–2:30 pm: Bahram Parvin, a scientist in LBNL’s Life Sciences Division, will discuss “Computational Histopathology for the Cancer Genome Atlas.”
  • 2:30–3:00 pm: Harish Bhat, assistant professor of applied mathematics at UC Merced, will discuss “Data-Driven Modeling and Prediction of Startup Company Exits.”

3:00–4:00 pm: Trends in NeuroScience

  • 3:00–3:30 pm: Bruno Olshausen, director of the Redwood Center for Theoretical Neuroscience at UC Berkeley, will discuss “Finding Patterns of Activity in Large-Scale Neural Recordings.”
  • 3:30–4:00 pm: Fritz Sommer, associate adjunct professor at the Redwood Center for Theoretical Neuroscience and faculty member at the University of Ulm, Germany, will discuss “Online Repositories for Neuroscience Enable Concerted Efforts to Understand the Brain.”


CITRIS Sponsors Visualization Technologies Conference on May 26

CITRIS is sponsoring a one-day conference, From Data Collection to Display: How Visualization Transforms Industries, from 9:00 am to 8:00 pm on Thursday, May 26, in the Banatao Auditorium in Sutardja Dai Hall on the Cal campus.

The visualization of data is one of our most powerful tools. It enables a remarkable degree of data compression and lets us see important relationships or interconnections that we might otherwise miss. Visualization is relevant to every sector of the economy.

This one-day event lets you discuss aspects of the visualization value chain, from data collection to user-friendly interfaces, with leading industry experts and promising startups. The goal is to provide a forum where experts from industry and academia address scientific, technological, and service-related issues in visualization.

The conference focuses on the driving need of industry to collect, transmit, and analyze huge amounts of data, and covers various aspects of video communications, including data generation, communications, immersive multimedia displays, and industry case studies. It addresses a broad range of applications and services, including video processing and delivery, service issues, and perspectives on areas of future development. The event features speakers from organizations leading this field, including Google, Oracle, UC Berkeley, and the Heinrich-Hertz-Institute.

Online registration is $99.


This Week’s Computing Sciences Seminars

HaLoop: Efficient Iterative Data Processing on Large Clusters
Monday, May 2, 11:00 am–12:00 pm, 465H Soda Hall, UC Berkeley
Yingyi Bu, UC Irvine

The growing demand for large-scale data mining and data analysis applications has led both industry and academia to design new types of highly scalable data-intensive computing platforms. MapReduce has enjoyed particular success. However, MapReduce lacks built-in support for iterative programs, which arise naturally in many applications including data mining, web ranking, graph analysis, and model fitting. In this talk, I will present HaLoop, a modified version of the Hadoop MapReduce framework that is designed to serve these applications. HaLoop allows iterative applications to be assembled from existing Hadoop programs without modification, and significantly improves their efficiency by providing inter-iteration caching mechanisms and a loop-aware scheduler to exploit these caches. HaLoop retains the fault-tolerance properties of MapReduce through automatic cache recovery and task re-execution. Evaluation results on three applications confirm our design intuitions.
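The payoff of inter-iteration caching can be sketched with a toy iterative job. The Python below is purely illustrative, not HaLoop's actual API: in plain MapReduce, a PageRank-style loop would re-read its loop-invariant link structure from the distributed file system on every iteration, whereas a loop-aware driver keeps that data cached across iterations.

```python
# Toy sketch of inter-iteration caching (illustrative; not HaLoop's API).
# The link structure is loop-invariant: a naive MapReduce driver would
# reload it each iteration, while a loop-aware driver caches it once.

links = {            # hypothetical tiny web graph: page -> outgoing links
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

def pagerank(links, iterations=20, damping=0.85):
    n = len(links)
    ranks = {p: 1.0 / n for p in links}
    # "Cache" the invariant input once, outside the loop, mimicking
    # HaLoop's inter-iteration cache.
    cached_links = links
    for _ in range(iterations):
        contribs = {p: 0.0 for p in links}
        for page, outs in cached_links.items():
            share = ranks[page] / len(outs)
            for dest in outs:
                contribs[dest] += share
        ranks = {p: (1 - damping) / n + damping * c
                 for p, c in contribs.items()}
    return ranks

ranks = pagerank(links)
print(round(sum(ranks.values()), 6))  # ranks stay a unit-sum distribution
```

Only the varying state (the rank vector) is recomputed each pass; the invariant data never leaves the cache, which is where HaLoop gets its speedup over iteration-oblivious MapReduce.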

The Ability of NCAR’s Community Atmosphere Model to Simulate Idealized Tropical Cyclones
Monday, May 2, 3:30–4:30 pm, 50F-1647
Kevin A. Reed, University of Michigan

Using General Circulation Models (GCMs) for tropical cyclone studies is difficult due to the relatively small size of the storms, the intense convection, and a host of interactions between large and small scales. These features are mostly unresolved at typical GCM resolutions of about 50–100 km, and remain challenging even at high resolutions of 12–30 km. Nevertheless, high-resolution GCMs are becoming a tool of choice for evaluating tropical cyclones in current and future climate conditions. Therefore, the physical and dynamical components of a GCM need to be carefully evaluated to assess their fidelity for tropical cyclone studies.

We develop and implement an idealized tropical cyclone test case for high-resolution GCMs in aqua-planet mode with constant sea surface temperatures. The initial conditions are based on an initial vortex seed that is in gradient-wind and hydrostatic balance and intensifies over a 10-day period. The initialization of the vortex is built upon prescribed 3D moisture, pressure, temperature and velocity fields that are embedded into tropical environmental conditions. The analytic technique can easily be implemented on any GCM computational grid.

The impact of the model physics package on the evolution of the tropical cyclone is assessed. In particular, we investigate different physics packages within NCAR’s hydrostatic Community Atmosphere Model (CAM), including CAM 3.1, CAM 4 and CAM 5 physics. The significance of small variations in the initial conditions and model physical constants on the evolution of the tropical cyclone is also assessed. In addition, we have developed and tested a simple physics suite that only incorporates surface fluxes, turbulence and large-scale precipitation as the driving mechanisms. Similarly, the impact of the dynamical core (the resolved fluid flow component) on the evolution of the tropical cyclone is assessed. In particular, we investigate four dynamical cores (FV, EUL, SLD, HOMME) that are part of NCAR’s CAM. The research isolates the impact of the physics schemes, numerical schemes and uncertainties on the evolution of the cyclone in CAM.

Navigating NERSC File Systems
Tuesday, May 3, 5:00–6:00 pm, OSF 943-238
Please register to participate online
David Turner, NERSC User Services Group

NERSC hosts a number of file storage systems, each with its unique characteristics. In this tutorial we will describe best uses for each file system, their data retention policies, how to move data among them, and how to share data with colleagues.

Self-Optimizing Microprocessors: A Machine Learning Approach
Wednesday, May 4, 11:00 am–12:00 pm, HP Auditorium, 306 Soda Hall, UC Berkeley
Jose Martinez, Cornell University

As each technology generation brings additional transistors, the computer industry hopes to convert these into performance growth by stamping out a greater number of cores on a die. On the one hand, in many environments, that seems like a lot of hope. On the other hand, architecture researchers have grown almost allergic to “complex” alternatives, which history has shown can quickly fall off the cliff of diminishing returns.

A fundamental hurdle to bettering architectures may lie in the very perception of complexity. Many past and current architectures are indeed complex, but often in an unsophisticated way. Hardware designs tend to be very human-centric: decisions are primarily taken by human beings at design time, based almost exclusively on their experience and ability. Unsurprisingly, the operation of the adopted mechanisms is typically confined to what a human can readily understand, and the end product often falls short in potentially important capabilities, such as the ability to plan ahead, to act successfully in previously unseen states, or to improve automatically with experience.

At Cornell, we are investigating architectures that possess and exploit such capabilities, primarily by leveraging machine learning technology. Our approach encourages the designer to focus more on the system variables and constraints that may play a role in realizing a performance objective, rather than formulating exactly how the hardware should accomplish such an objective. In this way, we have, for example, devised self-optimizing memory controllers that automatically adapt to changing software demands, delivering higher performance in ways that may be unintuitive to a human. Our ultimate goal is better computers through a more productive use of human ingenuity.
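The flavor of such a learning-based controller can be sketched with a minimal Q-learning loop. Everything in this toy is invented for illustration (the two-action choice, the reward values, the learning parameters); it is not the Cornell design. The point it shows is the one the abstract makes: the designer specifies the variables and the reward signal, and the policy itself is learned from experience rather than fixed by hand.

```python
import random

# Minimal Q-learning sketch (illustrative toy, not a real memory
# controller). The "controller" repeatedly picks one of two actions;
# action 1 has the higher average payoff, and the agent discovers
# this from noisy feedback alone.

random.seed(0)
rewards = {0: 0.2, 1: 0.8}   # hypothetical long-run payoff per action
q = {0: 0.0, 1: 0.0}         # learned action-value estimates
alpha, epsilon = 0.1, 0.1    # learning rate, exploration rate

for step in range(2000):
    # epsilon-greedy: mostly exploit the best-known action, sometimes explore
    if random.random() < epsilon:
        a = random.choice([0, 1])
    else:
        a = max(q, key=q.get)
    r = rewards[a] + random.uniform(-0.05, 0.05)  # noisy reward signal
    q[a] += alpha * (r - q[a])                    # incremental value update

best = max(q, key=q.get)
print(best)  # the agent settles on the higher-reward action
```

No line of this code encodes "prefer action 1"; that preference emerges from the reward signal, which is the sense in which such hardware can act in previously unseen states and improve with experience.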

LAPACK Seminar: Hybrid Parallel Ordering Method for a Parallelized Multiplicative Schwarz Smoother in a Multigrid Solver for Time-Harmonic Electromagnetic Field Problems
Wednesday, May 4, 11:10 am–12:00 pm, 380 Soda Hall, UC Berkeley
Takeshi Iwashita, Academic Center for Computing and Media Studies, Kyoto University

This research investigates large-scale parallel time-harmonic electromagnetic field analysis based on the finite element method. The parallel geometric multigrid preconditioned iterative solver for the resulting linear system was developed on a cluster of shared memory parallel computers. We propose a hybrid parallel ordering method for the parallelization of a multiplicative Schwarz smoother, which is a key component of the multigrid solver for electromagnetic field analysis. The method, using domain decomposition ordering for multi-process parallelism and introducing block multi-color ordering for multi-thread parallel processing, attains a high convergence rate with a small number of message passing interface communications and thread synchronizations. The numerical test confirms that the proposed method attains a solver performance more than twice as good as the method based on multi-color ordering. Furthermore, an approximately 800 million degrees of freedom problem is successfully solved on 256 quad-core processors.
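The simplest instance of a multi-color ordering is the classic two-color (red-black) Gauss-Seidel sweep. The sketch below is our own illustration on a 1D Poisson problem, not anything from the talk: within one color, no point depends on another point of the same color, so each half-sweep is trivially parallelizable across threads.

```python
# Two-color (red-black) Gauss-Seidel for -u'' = f on [0, 1]
# (illustrative toy, not the hybrid ordering from the talk).
n = 17                   # grid points including boundaries (assumption)
h = 1.0 / (n - 1)
u = [0.0] * n            # zero Dirichlet boundary values
f = [1.0] * n            # constant right-hand side (assumption)

for sweep in range(2000):
    for color in (1, 0):             # odd-indexed points, then even-indexed
        # Every point of one color depends only on neighbors of the
        # other color, so this inner loop could run fully in parallel
        # with no synchronization inside a half-sweep.
        for i in range(1, n - 1):
            if i % 2 == color:
                u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])

# exact solution of -u'' = 1 with u(0) = u(1) = 0 is x(1 - x)/2
err = max(abs(u[i] - 0.5 * (i * h) * (1 - i * h)) for i in range(n))
print(err < 1e-9)
```

A plain lexicographic Gauss-Seidel sweep is inherently sequential because each update reads the just-updated neighbor; recoloring the unknowns removes those same-sweep dependencies, at the cost of the ordering effects on convergence that the talk's hybrid scheme is designed to manage.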

OSF HPC Seminar Series: LDAP Infrastructure Upgrade
Thursday, May 5, 12:00–1:30 pm, OSF 943-238
Tom Davis, NERSC Outreach, Software, and Programming Group

Tom will discuss the old and new LDAP services: how to use the new service, how to monitor what it is doing, and how to check for problems. The release date for the new LDAP service will be announced.


Link of the Week: The Internet in Society: Empowering or Censoring Citizens?

Does the internet actually inhibit, rather than encourage, democracy? In a new RSA animation adapted from a talk given in 2009, Evgeny Morozov presents an alternative take on “cyber-utopianism”—the seductive idea that the internet plays a largely emancipatory role in global politics. Exposing some idealistic myths about freedom and technology (during Iran’s “Twitter revolution,” fewer than 20,000 Twitter users actually took part), Morozov argues for some realism about the actual uses and abuses of the internet. See video.



About Computing Sciences at Berkeley Lab

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.