LBNL to Highlight Leadership in Computational Science in Presentations, Demos at SC04
October 27, 2004
What kind of scientific breakthroughs can researchers achieve with one million dedicated processor hours on one of the world’s fastest supercomputers? Two million hours? Thanks to a special Department of Energy program, three research groups studying turbulence, astrophysics and chemistry were awarded a total of nearly 5 million hours on the 6,652-processor IBM supercomputer at the National Energy Research Scientific Computing Center (NERSC).
Leaders of the three projects, supported under the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, will present their results at the Lawrence Berkeley National Laboratory (LBNL) booth (no. 139) starting at 1 p.m. Tuesday, Nov. 9, at the SC2004 conference in Pittsburgh, Pa.
At the conference, Berkeley Lab will be hosting three days of talks by leading experts in computational science, cyber security, networking, and tools for advancing scientific computing. LBNL will also demonstrate capabilities in scientific visualization, distributed computing, cyber security, scientific data management and cluster performance. Demos and talks will feature results from NERSC, which is located at Berkeley Lab.
“We are extremely proud that the principal investigators of the three INCITE projects have agreed to present their results based on computations performed at NERSC,” said NERSC Division Director Horst Simon. “These presentations, in addition to the other talks in our booth, reflect NERSC’s position as DOE’s leading center for unclassified computational science.”
Visitors to the LBNL booth can also enter a daily drawing to win books signed by LBNL authors: David Bailey’s “Mathematics by Experiment,” Grigory Barenblatt’s “Scaling,” and “Intrusion Detection and Prevention” co-authored by Eugene Schultz and Jim Mellander.
Here is the schedule of talks in the LBNL booth:
Tuesday, November 9
- 10:45 a.m. – “SciDAC Scientific Data Management Center: Infrastructure and Results,” Arie Shoshani, LBNL
- 11:30 a.m. – “Science-Driven Visualization Research Challenges,” Wes Bethel, Scientific Visualization Group, LBNL
- 1 p.m. – “Innovative and Novel Computational Impact on Theory and Experiment (INCITE) Introduction,” Horst Simon, LBNL
- 1:05 p.m. – “Quantum Monte Carlo Study of Photoprotection via Carotenoids in Photosynthetic Centers,” William Lester and Alan Aspuru-Guzik, UC Berkeley College of Chemistry
- 1:45 p.m. – “Thermonuclear Supernovae: Stellar Explosions in Three Dimensions,” Tomasz Plewa and Timur Linde, University of Chicago
- 2:30 p.m. – “Fluid Turbulence and Mixing at High Reynolds Number,” Pui-Kuen Yeung and Diego Donzis, Georgia Institute of Technology
- 3:15 p.m. – “Comprehensive Scientific Support of Large Scale Parallel Computation,” David Skinner, LBNL
- 4 p.m. – “Modeling Core-Collapse Supernovae at NERSC,” Douglas Swesty and Eric Myra, Department of Physics and Astronomy, State University of New York at Stony Brook
Wednesday, November 10
- 10:45 a.m. – “Performance Understanding, Prediction, and Tuning at the Berkeley Institute for Performance Studies,” Kathy Yelick, LBNL and UC Berkeley
- 11:30 a.m. – “Performance of a Cosmology Package on Leading Vector and Superscalar Architectures,” Lenny Oliker, LBNL
- 1 p.m. – “The Advanced Networks and Services Underpinning Modern, Large-Scale Science: DOE’s ESnet,” William E. Johnston, LBNL
- 1:45 p.m. – “Overview of the Bro-Lite Intrusion Detection System,” Brian Tierney, LBNL
- 2:30 p.m. – “The Return of the Cube: Spinning the Security of SCinet,” Stephen Lau, LBNL
- 3:15 p.m. – “Keeping Dirty Computers Off the Network with NETS,” James Rothfuss, LBNL
- 4 p.m. – “FusionGrid: Bringing the Sun to the Earth,” David Schissel, General Atomics
Thursday, November 11
- 10:45 a.m. – “Twelve Ways to Fool the Masses: Back to the Future,” David Bailey, LBNL
- 11:30 a.m. – “Scaling First Principles Nanoscience and Materials Science Codes to Thousands of Processors,” Andrew Canning, LBNL
- 1 p.m. – “PDSF CHOS,” Shane Canon, LBNL
Berkeley Lab will also feature the following demonstrations in its booth:
Spinning Cube of Potential Doom, Stephen Lau, NERSC/LBNL. Using the Bro Intrusion Detection System to monitor SCinet, SC's high performance network, the Cube displays the continuous volume of potentially malicious traffic on the open Internet.
Bro-Lite Intrusion Detection System. Brian Tierney, LBNL Computational Research Division. Bro-Lite is a new release of the Bro Intrusion Detection system. Bro-Lite is designed to be easier to install and to use, and includes a number of new features, including attack signature matching.
Is Your Computer Dirty? Let NETS Find Out, James Rothfuss, LBNL Information Technologies and Services Division. One of the most effective technologies for spreading worms and viruses is mobile computing. The busy, on-the-go laptop can pick up plenty of infections during its travels and bring them all back home. Berkeley Lab is developing NETS to automatically detect infected and/or vulnerable systems before they jump onto the network.
PDSF CHOS, Shane Canon, NERSC/LBNL. CHOS is a framework for concurrently running multiple Linux environments (distributions) on a single node. This is accomplished through a combination of the chroot system call, a Linux kernel module, and some additional utilities. It can be configured so that users are transparently presented with their selected distribution on login.
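The chroot-based switching at the heart of CHOS can be sketched in a few lines of Python. The paths, mapping, and function names below are invented for illustration; CHOS's actual configuration, kernel-module mechanism, and layout differ.

```python
import os

# Hypothetical mapping from a user's selected distribution to a
# filesystem tree holding that distribution (paths are illustrative,
# not CHOS's real layout).
CHOS_ROOTS = {
    "sl3": "/chos/sl3",        # e.g. a Scientific Linux tree
    "debian": "/chos/debian",  # e.g. a Debian tree
}

def root_for(selection):
    """Resolve a user's selection to a chroot target,
    falling back to the real root filesystem."""
    return CHOS_ROOTS.get(selection, "/")

def enter_environment(selection):
    """Confine the session to the chosen distribution's tree.
    Requires root privilege; the real CHOS uses a kernel module
    so that ordinary logins get this transparently."""
    os.chroot(root_for(selection))
    os.chdir("/")  # ensure the working directory is inside the new root
```

Because `os.chroot` needs root privilege, `enter_environment` is only a sketch of the system call involved, not something an unprivileged login could run directly.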
Serial, Parallel and Distributed Checkpoint/Restart for Linux, Paul Hargrove, LBNL Computational Research Division. Researchers in Berkeley Lab's Future Technologies Group are developing a new system-level implementation of checkpoint/restart for Linux clusters as part of the SciDAC Scalable Systems Software Center. The goal is to support checkpointing of a wide range of scientific applications without requiring modifications to the application code. This demonstration will highlight the capabilities of the current version, including single- and multi-threaded processes and distributed LAM/MPI jobs. These checkpointing capabilities are available both as a stand-alone tool, and as an integrated part of the Scalable Systems Software Suite.
ViCE – collaborative visual programming and scientific data analysis environment, David Konerding, Deb Agarwal, Brian Tierney, Keith Jackson, Karlo Berket, LBNL Computational Research Division. We will demonstrate domain-specific (computational biology and computational chemistry) workflows implemented as visual programs, as well as collaborative features, including group and private conversations, automatic archiving of workflow design and execution, and multi-user control of executing workflows. Archived workflows will be published in a standardized format allowing scientist-peers to download and reproduce workflows on their own resources. Integrated logging will be used to track workflows executing on disparate resources and facilitate debugging of distributed components.
Certificate and Access Management for the FusionGrid, Mary Thompson, LBNL Computational Research Division, David Schissel, General Atomics. The SciDAC FusionGrid project has been developing and deploying a set of Grid-enabled tools which provide secure access to compute, data storage and visualization services. We will demo the credential-handling interface and the interface to the ROAM authorization database. When a credential is created, the user is automatically added to the FusionGrid's centralized authorization database. That database can be managed through a Web interface. Users can check for their permissions, and authorized parties can add new resources and grant access to resources.
Robust Terabyte-Scale Multi-file Replication over Wide-Area Networks, Alex Sim, Arie Shoshani, Eric Hjort, Doug Olson, LBNL Computational Research Division. Typically, large scientific datasets (order of terabytes) are generated at large computational centers, and stored on mass storage systems. However, large subsets of the data need to be moved to facilities available to application scientists for analysis. File replication of thousands of files is a tedious, error-prone, but extremely important task in scientific applications. The automation of the file replication task requires automatic space acquisition and reuse, and monitoring the progress of staging thousands of files from the source mass storage system, transferring them over the network, and archiving them at the target mass storage system or disk systems. We have used Storage Resource Manager (SRM) technology to achieve robust file replication for several scientific domains. A robust replication system, called DataMover, is now in regular use in high-energy-physics and climate modeling experiments. The SRM monitors the staging, transfer and archiving of files, and is able to recover from transient failures. Only a single command is necessary to request multi-file replication or the replication of an entire directory. A Web-based tool was developed to dynamically monitor the progress of the multi-file replication process.
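The recovery-from-transient-failure behavior described above can be sketched as a simple per-file retry loop. The function and parameter names here are invented for illustration and are not the actual SRM or DataMover interface.

```python
import time

def replicate(files, transfer, max_retries=3, backoff=1.0):
    """Toy sketch of a robust multi-file replication loop:
    attempt each transfer, retrying on transient failure with a
    simple linear backoff, and report files that never succeeded.
    `transfer` is a caller-supplied callable that raises IOError
    on a transient failure."""
    failed = []
    for f in files:
        for attempt in range(max_retries):
            try:
                transfer(f)
                break  # this file replicated successfully
            except IOError:
                time.sleep(backoff * (attempt + 1))  # wait, then retry
        else:
            failed.append(f)  # retries exhausted for this file
    return failed
```

A production system such as DataMover additionally coordinates staging from and archiving to mass storage and monitors progress across thousands of files; this sketch shows only the retry skeleton.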
FastBit Indexing Applied to 3D Visualization, Kurt Stockinger, John Shalf, John Wu, LBNL Computational Research Division. FastBit is a software tool that provides efficient high-dimensional searching capabilities based on bitmap indices. It relies on the fact that most scientific data is append-only, and uses a specialized compression method developed at LBNL to achieve search speeds a factor of 10 faster than the best known high-dimensional indexing methods. FastBit can be applied to spatio-temporal data. It uses the bitmap technology for region finding based on multiple features of the mesh points, such as “temperature” and “pressure”. It also provides very efficient region-growing algorithms using the bitmaps for both 2D and 3D data. These algorithms allow one to search for regions of interest such as stellar objects in astrophysics or flame fronts in combustion studies. In this demo we combine FastBit with 3D visualization. In particular, we demonstrate how to perform multi-dimensional queries to identify regions of interest in supernova studies. Traditional visualization frameworks usually operate only on one-dimensional queries. By integrating FastBit into the visualization framework, it is now possible to perform interactive, efficient feature-based analysis and region finding for high-dimensional queries. By displaying the resulting regions of interest, application scientists can quickly identify characteristic features of their data.
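The core idea, answering a multi-variable range query with bitwise operations on per-variable bitmaps, can be illustrated with a toy, uncompressed bitmap in pure Python. FastBit itself compresses its bitmaps with a specialized encoding and is far more sophisticated; the data and predicates below are made up for illustration.

```python
import random

random.seed(0)
n = 10_000
# Synthetic per-mesh-point values standing in for simulation output.
temperature = [random.uniform(0, 100) for _ in range(n)]
pressure = [random.uniform(0, 10) for _ in range(n)]

def bitmap(values, predicate):
    """Pack a per-point boolean predicate into one big integer,
    one bit per mesh point (a toy, uncompressed bitmap)."""
    bits = 0
    for i, v in enumerate(values):
        if predicate(v):
            bits |= 1 << i
    return bits

hot = bitmap(temperature, lambda t: t > 90.0)
high_p = bitmap(pressure, lambda p: p > 9.0)

# A multi-dimensional query ("temperature > 90 AND pressure > 9")
# reduces to a single bitwise AND of the two bitmaps.
roi = hot & high_p
selected = [i for i in range(n) if roi >> i & 1]
```

The payoff is that adding another query dimension costs only one more bitwise AND, which is what makes interactive high-dimensional region finding feasible.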
Using the ACTS Collection, Tony Drummond and Osni Marques, LBNL Computational Research Division. Here you will find several demos that highlight some of the many features of the DOE ACTS Collection Project.
1. PyACTS: a Python-based interface to some of the numerical tools in the ACTS Collection
2. MATAPPS: A matrix of high-end scientific and engineering applications using tools in the ACTS Collection
3. The ACTS Information Center: how to find information
4. ACTS Support: Install ACTS tools on your laptop or home computer and try out a tutorial and examples.
5. TAU: A performance analysis tool in the ACTS Collection
Don't forget to get a free ACTS Collection CD with lots of tutorials and background material to help you develop high performance scientific computing applications.
Visualization of Electron Walkers Computed by Quantum Monte Carlo Simulation of Energy Pathways in Photosynthesis Reactions, William Lester, LBNL/UC Berkeley, Cristina Siegerist & John Shalf, LBNL Computational Research Division. This INCITE project, led by William A. Lester, Jr. of LBNL and UC Berkeley, aims to increase understanding of the complex processes that occur during photosynthesis, the process by which plants and bacteria convert the sun's light into energy. As part of a multidisciplinary collaborative team, the NERSC Visualization Group performs advanced development to create and apply visualization technology aimed at increasing scientific insight into the complex output from the project's Quantum Monte Carlo simulation, which models electron pathways through carotenoids and chlorophyll during a photosynthesis reaction.
Visualization Helps Provide Insight into 3D Fluid Turbulence and Mixing at High Reynolds Number, P. K. Yeung, Georgia Institute of Technology, Cristina Siegerist & John Shalf, LBNL Computational Research Division. In a collaboration with P.K. Yeung of Georgia Tech, the NERSC Visualization Group documents the evolution of visual data analysis methods and techniques that reveal never-before-seen features in the high-resolution data now being produced at NERSC by the INCITE3 project.
Visualization of the Electron-Cloud Effect (21st Century Accelerator SciDAC), Andreas Adelmann, Paul Scherrer Institut, Switzerland; Cristina Siegerist, LBNL Computational Research Division. This project, a collaboration with Andreas Adelmann from PSI, aims at studying the electron cloud instabilities that can disrupt the main accelerator beam. The simulation results, which consist of the positions and phase (velocity) information of particles (protons and electrons), are large and complex datasets that are difficult to manipulate and understand. The NERSC Visualization Group developed and applied technologies to facilitate rapid exploration and visual analysis of accelerator modeling simulation results. Our SC04 demonstration illustrates visualization techniques that aid in gaining scientific insight, as well as architectures for remote and distributed visualization of large accelerator modeling simulation output.
LBNL Remote, Distributed and High Performance AMR Visualization, Phil Colella, LBNL; Ravi Samtaney, PPPL; Tom Abel, Stanford University; John Shalf and Wes Bethel, LBNL; Oliver Kreylos, UC Davis. Adaptive Mesh Refinement (AMR) is a technique for automatically refining (or de-refining) regions of a computational domain during a numerical calculation based upon application-specific criteria, such as flame-front tracking during a combustion simulation. The multiresolution and hierarchical nature of AMR grids presents special challenges for mainstream visualization tools, which typically can operate only on single grid domains. At SC04, the LBNL Visualization Group will show ongoing AMR visualization activities. First, LBNL's hardware-accelerated volume renderer is being used to create images for a PBS program on cosmology. Second, the group will demonstrate use of custom data converters that permit AMR grids to be visualized using CEI's EnSight and LLNL's VisIt, both of which implement a pipelined/parallel architecture and are effective in remote and distributed visualization contexts.
Visualization of Computational Atomic Physics for Fusion, Mitch Pindzola, Auburn University; Cristina Siegerist, LBNL Computational Research Division. Atomic physics plays a central role in many of the high-temperature and high-density plasmas found in magnetic and inertial confinement fusion experiments, which are crucial to our national energy and defense interests, as well as in technological plasmas important to the U.S. economic base. In turn, the development of the necessary atomic physics knowledge depends on advances in both experimental and computational approaches. The Computational Atomic Physics for Fusion SciDAC project hosted at NERSC is producing early results simulating the time evolution of a wavepacket scattering from a helium atom. Our SC04 demonstration reflects a collaborative effort between M. Pindzola (Auburn University) and the NERSC Visualization Group to create visualizations of these early results.
Visualization of Three-Dimensional Reconstructions from Electron Tomography Studies of Bacterial Structure and Function, Ken Downing, LBNL/UC Berkeley; Cristina Siegerist, NERSC/LBNL Scientific Visualization. In a collaboration with Ken Downing of LBNL's Life Sciences Division, the NERSC Visualization Group documents the process of identifying, visualizing and analyzing cellular-level structures observed in data acquired through electron tomography.
Meeting Challenges in Molecular Biology Through Interactive Visual Data Analysis (ProteinShop), Silvia Crivelli, NERSC/LBNL Scientific Visualization. Predicting the three-dimensional structure and function of proteins from fundamental chemical building blocks is one of the grand challenges in biology. Understanding molecular structure and function will yield new insights into fundamental biochemical processes that in turn create the opportunity for new scientific and medical advances. LBNL researchers will demonstrate recent technological advancements that help to accelerate protein structure determination using computational approaches. The SC04 demonstration will focus on visual data analysis processes that aid in understanding the relationship between the folding process of a protein molecule and its internal free energy.
Immersive Information Visualization for Exploration, Discovery and Analysis, Steve Smith, LANL/LBNL. This project aims to leverage several technologies to aid in visual data analysis of several different types of information, with an emphasis on cyber security applications. We will show ongoing work in collaboration between LANL, UNM and LBNL on several projects: network monitoring for intrusion detection, domain name hierarchy analysis, supercomputing architecture and dynamics, transaction analysis, and critical infrastructure protection decision support systems. The Flatland virtual reality framework developed at UNM and the Flux data flow environment developed at LANL were used to develop these real-time, interactive, immersive demonstrations. Simulation and real-time data are from the ASCI program, NNSA, DHS, and LANL and LBNL network security projects.
MBender: Leveraging QuickTime VR as a Delivery Vehicle for Remote and Distributed Visualization, Jerry Chen, LBNL/SFSU; Wes Bethel, LBNL. In remote and distributed visualization settings, the viewer is in one location while the data to be analyzed or visualizations are located somewhere else. Typically, "somewhere else" is in a high performance mass storage system located at the large supercomputer center where the simulation was run or experimental data cached. Research in remote and distributed visualization focuses on alternative partitionings for the visualization pipeline with the aim of maximizing performance through the pipeline. This project explores the use of the industry-standard QuickTime VR Object Movie as a delivery vehicle for interactive, 3D visualization. QTVR is an attractive media format because it allows 3D interaction with time-varying, pre-computed visualization results.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.