
The 20th Century Reanalysis Project: A Climate Time-Machine

New Dataset Provides Understanding of Earth's Past and Future Climate

January 25, 2011

Berkeley Lab Contact: Jon Bashor, [email protected], 510-486-5849
Wiley-Blackwell Contact: Ben Norman, [email protected], +44 (0)1243 770375
Science Contact: Jeffrey Whitaker, [email protected], (303) 497-6313

From the hurricane that smashed into New York in 1938 to the impact of the Krakatoa eruption of 1883, the late 19th and 20th centuries are rich with examples of extreme weather. Now an international team of climatologists has created a comprehensive reanalysis of global weather from 1871 to the present day, spanning from the Earth's surface to the level of the jet stream.


The 20th Century Reanalysis Project, outlined in the Quarterly Journal of the Royal Meteorological Society, not only allows researchers to understand the long-term impact of extreme weather, but provides key historical comparisons for our own changing climate.

"Producing this huge dataset required an international effort to collate historical observations and recordings from sources as diverse as 19th-century sea captains, turn-of-the-century explorers, and medical doctors, all pieced together using some of the world's most powerful supercomputers at the US Department of Energy's National Energy Research Scientific Computing Center in California and the Oak Ridge Leadership Computing Facility in Tennessee," said lead author Dr. Gil Compo.

"The resulting weather maps, called reanalyses, provide a much longer record of past weather variability than is currently available to compare present and projected weather variability in a warming climate. They also provide valuable insight into extreme weather and climate events that were historically important."

Dr. Compo leads the 20th Century Reanalysis Project (20CR) at the National Oceanic and Atmospheric Administration (NOAA) Earth System Research Laboratory (ESRL) and at the Climate Diagnostics Center of the Cooperative Institute for Research in Environmental Sciences (CIRES), a joint institute of NOAA and the University of Colorado. His colleagues on the project include Dr. Jeffrey Whitaker of NOAA, Dr. Prashant Sardeshmukh of NOAA and the CIRES Climate Diagnostics Center, and Dr. Rob Allan of the United Kingdom Met Office Hadley Centre. The 20CR is produced in partnership with the Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative, the Global Climate Observing System (GCOS), and 36 other international organisations.

In 2007, Compo was awarded 2 million hours of supercomputing time at the National Energy Research Scientific Computing Center (NERSC) under the Department of Energy's (DOE) Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. The INCITE program was created to support projects that not only require large-scale and intensive use of supercomputers but also promise to deliver a significant advance in science and engineering. In 2008, Compo received an even larger INCITE allocation to continue his successful efforts at NERSC, and used a total of 10 million supercomputing hours at the facility.

"The 20th Century Reanalysis project is a great example of how dedicated supercomputing time and technical support can significantly advance research in an area that has significant impact on our lives," said NERSC Director Kathy Yelick. "In addition to furthering his own research, Gil Compo's team has also made their data available to the wider climate research community, giving his work even greater reach."



In 2009, DOE awarded Compo's INCITE project 1.1 million hours at the Oak Ridge Leadership Computing Facility. However, he also continued computing at NERSC with allocations from DOE's Office of Biological and Environmental Research and NERSC's Initiative for Scientific Exploration. Over the last four years, Compo has used over 20 million processing hours at NERSC, primarily on the Cray XT4 system and the earlier IBM SP system.

By using historical climate data to understand current weather patterns, the 20CR team, which includes 27 international scientists, is building on the work of its meteorological forebears, such as the U.S. Historical Weather Map Series produced by the U.S. Weather Bureau to better understand weather events preceding World War II. However, the 20CR is the first project of its kind to span a full century.

"A preliminary version of this project (20CRv1, Compo et al., 2008) spanned the period 1908 to 1958," said Compo. "In this second and complete version (20CRv2), the global atmospheric fields for 1871 to 2008 have been generated. We hope, as Wexler and Tepper of the US Weather Bureau said in 1947, that this project can 'breathe life into a mass of inert data' while providing an indispensable aid to future research."

The 20CR dataset provides the first long-term estimates of global tropospheric variability: weather maps from the Earth's surface to the level of the jet stream, along with estimates of their time-varying quality, covering 1871 to the present at 6-hourly temporal and 2° spatial resolution.
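Those resolution figures imply a very large volume of output. A rough back-of-envelope sketch, assuming a regular 2° global grid of 180 longitudes by 91 latitudes (the grid dimensions are an assumption; the article states only the nominal resolutions), gives a sense of the scale for a single surface field:

```python
# Back-of-envelope count of values in one 20CR surface field,
# assuming a regular 2-degree global grid with poles included.
# Grid dimensions are an assumption; the article gives only
# "6-hourly temporal and 2-degree spatial resolution".
n_lon = 360 // 2          # 180 longitude points
n_lat = 180 // 2 + 1      # 91 latitude points (both poles)
steps_per_year = 365 * 4  # four analyses per day, ignoring leap days
years = 2008 - 1871 + 1   # the 138 years covered by 20CRv2

values_per_field = n_lon * n_lat * steps_per_year * years
print(f"{values_per_field:,} values")  # roughly 3.3 billion per field
```

Multiplied across dozens of atmospheric variables and pressure levels, figures like this help explain why the project needed millions of supercomputing hours.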

"The new dataset will allow climate scientists to put current weather extremes in a historical perspective and determine how extremes are changing," said Compo. "Just how extreme is the recent European cold wave, for example, or the blizzard in the US Northeast?"

The 20CR dataset also gives new insight into the weather events that may have misinformed early-century policy decisions, such as the wet period in central North America that led to overestimates of rainfall and over-allocation of water resources in the Colorado River basin in the years before the US Dust Bowl of the 1930s.

"This reanalysis data will enable climate scientists to rigorously evaluate past climate variations compared to climate model simulations, which is critical for building confidence in model projections of regional changes and high-impact, extreme events," concluded Compo. "We hope that this 138-year reanalysis data will enable climate researchers to better address issues such as the range of natural variability of extreme events, including floods, droughts, extratropical cyclones, and cold waves."

Read more about Compo's work at NERSC. This paper appears in the Quarterly Journal of the Royal Meteorological Society, published by Wiley-Blackwell on behalf of the Royal Meteorological Society. This text was adapted from a Wiley-Blackwell press release.


About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery, and researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab’s Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.