


NERSC Uses Stimulus Funds to Overcome Software Challenges for Scientific Computing

October 30, 2009

A "multi-core" revolution is occurring in computer chip technology. No longer able to sustain the steady increases in processor clock speed of previous decades, chip manufacturers are instead producing multi-core architectures that pack increasing numbers of cores onto each chip. In high performance scientific computing, this revolution is forcing programmers to rethink the basic models of algorithm development as well as parallel programming, from the choice of language to the way a problem is decomposed across cores. Read More »
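The parallel decomposition the article refers to can be sketched in miniature: split a problem into chunks, map the chunks onto separate cores, and reduce the partial results. This is a minimal illustration using Python's standard concurrent.futures module, not the programming models under study in the research described here.

```python
# Minimal sketch of data-parallel decomposition across cores.
# Illustrative only; the research described above concerns new
# parallel programming models, not this specific API.
from concurrent.futures import ProcessPoolExecutor


def partial_sum(chunk):
    """Work assigned to one core: sum of squares over a slice of the data."""
    return sum(x * x for x in chunk)


def parallel_sum_of_squares(data, n_workers=4):
    # Decompose the problem: split the data into one chunk per worker.
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Map chunks onto separate processes, then reduce the partial results.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    # The parallel result matches the serial computation.
    print(parallel_sum_of_squares(list(range(1000))))
```

The hard part on real scientific codes is that the decomposition, communication, and load balance rarely fall out this cleanly, which is why new languages and models are being explored.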


Berkeley Lab Researchers Prepare U.S. Climate Community for 100-Gigabit Data Transfers

October 30, 2009

As researchers around the world tackle the issue of global climate change, they are both generating and sharing increasingly large amounts of data. This increased collaboration helps climate scientists better understand what is happening and evaluate the effectiveness of possible mitigations. Read More »

B-ISICLES (Berkeley Ice Sheet Initiative for Climate at Extreme Scales) Project to Improve Accuracy of Ice Sheet Models

October 30, 2009

One of the most-cited examples of global climate change is the retreating ice sheets of Antarctica and Greenland. But just how fast they are melting is a mystery that may be solved with a new generation of computer simulations. Read More »


It's Not Too Late

October 27, 2009

The threat of global warming can still be greatly diminished if nations cut emissions of heat-trapping greenhouse gases by 70 percent this century, according to a study led by scientists at the National Center for Atmospheric Research (NCAR). While global temperatures would rise, the most dangerous potential aspects of climate change, including massive losses of Arctic sea ice and permafrost and significant sea level rise, could be partially avoided. Read More »

DOE to Explore Scientific Cloud Computing at Argonne, Lawrence Berkeley National Laboratories

October 14, 2009

ARGONNE, Ill., and BERKELEY, Calif. (Oct. 14, 2009) – Cloud computing is gaining traction in the commercial world, but can such an approach also meet the computing and data storage demands of the nation's scientific community? A new program funded by the American Recovery and Reinvestment Act through the U.S. Department of Energy (DOE) will examine cloud computing as a cost-effective and energy-efficient computing paradigm for scientists to accelerate discoveries in a variety of disciplines, including analysis of scientific data sets in biology, climate change and physics. Read More »


Lasers without mirrors, designed by supercomputer

October 14, 2009

Sometimes it takes a big machine to understand the tiniest details. That’s the case with free electron lasers (FELs). The powerful X-rays they generate can probe matter directly at the level of atomic interactions and chemical-bond formation, letting scientists observe such phenomena as chemical reactions in trace elements, electric charges in photosynthesis and the structure of microscopic machines. FELs have the potential to address a host of research challenges in physics, chemistry, materials science and the biological sciences. Read More »


Juan Meza Named One of Hispanic Business Magazine's "100 Influentials"

October 6, 2009

Juan Meza, head of the High Performance Computing Research Department in Berkeley Lab's Computational Research Division, has been named to Hispanic Business magazine's annual list of 100 influential Hispanics. The list, published in the October issue, includes Hispanics who play leading roles in politics, business, science, information technology, health care, education, the media, and other areas. Meza and Puerto Rican astronaut Joseph Acaba are the only two scientists on this year's list. Read More »


NERSC Contributes to EMGeo Mapping Software for Finding Hidden Oil and Gas Reserves

September 30, 2009

As the world's demand for energy increases, billions of dollars are dedicated each year to the search for deep-water hydrocarbon reservoirs. Although seismic imaging methods have long been used to collect valuable information on geological structures bearing hydrocarbon deposits, they have not proven effective at discriminating among different types of reservoir fluids, such as brines, oil, and gas. This inability to discriminate leads over time to huge financial losses from drilling dry holes (up to $100 million per unsuccessful well), while significant hydrocarbon reservoirs remain undiscovered. Read More »


Reanalysis Project targets once-and-future weather

September 29, 2009

Alone near the South Pole in 1934, Admiral Richard Byrd documented his ice-bound despair. He also dutifully recorded the surface air pressure every hour. Read More »


Berkeley Lab Scientists' Computer Code Gives Astrophysicists First Full Simulation of Star's Final Hours

September 22, 2009

BERKELEY, CA — The precise conditions inside a white dwarf star in the hours leading up to its explosive end as a Type Ia supernova are one of the mysteries confronting astrophysicists studying these massive stellar explosions. But now, a team of researchers, composed of three applied mathematicians at the U.S. Department of Energy's (DOE) Lawrence Berkeley National Laboratory and two astrophysicists, has created the first full-star simulation of the hours preceding the largest thermonuclear explosions in the universe. Read More »