A-Z Index | Phone Book | Careers

InTheLoop 11.16.2015

November 16, 2015

Ceremony Officially Opens Wang Hall

The Shyh Wang Hall computational research and theory facility officially opened its doors Thursday with a dedication ceremony and symposium. UC President Janet Napolitano, DOE Office of Science Facilities Division Director Barbara Helland and UC Berkeley Chancellor Nicholas Dirks all spoke at the ceremony, which was hosted by Berkeley Lab Director Paul Alivisatos, Deputy Lab Director Horst Simon and Associate Lab Director for Computing Sciences Kathy Yelick. With the U.S. Congress in session, Senator Dianne Feinstein and Representative Barbara Lee offered their congratulatory remarks via video recording. Special guest Dila Wang, widow of UC Berkeley Emeritus Prof. Shyh Wang, for whom the building was named, helped light a ceremonial fiber optic connection and offered brief remarks. An afternoon symposium and an evening poster session with refreshments rounded out the day's activities. Staff who were unable to join the overbooked tours can reschedule by emailing David Skinner at DESkinner@lbl.gov.

»See photos.  »Read the press release.

Berkeley Lab Researchers Spark Big Data Analytics

A team of scientists from Berkeley Lab’s Computational Research Division has been awarded a two-year, $110,000 grant by Intel to support their goal of enabling data analytics software stacks—notably Spark—to scale out on next-generation high performance computing systems. Functioning as an Intel Parallel Computing Center (IPCC), the new research effort will be led by Costin Iancu and Khaled Ibrahim, both computational scientists in CRD’s Computer Languages and Systems Software Group.

Spark is an open source computing framework for processing large datasets. It was developed in 2009 in the University of California, Berkeley’s AMPLab by then Ph.D. student Matei Zaharia and went open source in 2010 before being donated to the Apache Software Foundation in 2013. Spark’s ability to cache datasets in memory makes it well suited for large data analysis, especially on systems with large memory space. »Read more.

Berkeley Lab at SC15

This week, many Berkeley Lab staffers are at the SC15 supercomputing conference in Austin, Texas, where they are sharing their research, accepting awards, demonstrating their expertise and inspiring the next generation of computational and computer scientists.

»Technical Program Participation 

»ACM/IEEE Ken Kennedy Award: Kathy Yelick

»Test of Time Award: Horst Simon & David Bailey

»Talks, Roundtables and Demos (DOE Booth: 502)

»Outreach Activities

Other Berkeley Lab and ESnet Demos include the following:

"Emulating Future HPC SoC Architectures" with Farzad Fatollahi-Fard, David Donofrio and John Shalf (Berkeley Lab).

  • Where: Room 14
  • When: 9 a.m.–5:30 p.m., Tuesday–Thursday, Nov. 17–19

“Science DMZ as a Service” with Inder Monga (ESnet), Ilya Baldin (RENCI) and Craig Tull (Berkeley Lab).

  • Where: RENCI booth (181)
  • When: Tuesday & Wednesday, Nov. 17–18, 2:30–3:30 p.m.; Thursday, Nov. 19, 1:30–2:30 p.m.

“Software-Defined Networking” with Inder Monga (ESnet) and Eric Pouyoul (ESnet).

  • Where: Corsa Technology booth (364)
  • When: Monday, Nov. 16, 7–9 p.m.; Tuesday & Wednesday, Nov. 17–18, 10:30 a.m.–12 p.m. & 4:30–5:30 p.m.; Thursday, Nov. 19, 10:30 a.m.–12 p.m.

“Early Experiences Optimizing Applications for the NERSC Cori Supercomputer” panel discussion with Katie Antypas and Richard Gerber (NERSC).

  • Where: Intel booth (1333)
  • When: 2:30 p.m., Wednesday, Nov. 18

Trillion-particle Simulations Take on the Biggest Big Data

At SC15 in Austin this week, a collaboration between scientists in Berkeley Lab's Computational Research Division, NERSC and Intel's Parallel Computing Lab is presenting its first fruits: a plasma-physics simulation of 1.4 trillion particles run in 30 minutes on roughly 100,000 processor cores of Edison, a Cray supercomputer operated by NERSC.

Billed as "the first end-to-end clustering system that deals with a trillion particles and tens of terabytes of data,” the group's work is a triumph of collaborative science, not only between private and public entities, but between computer scientists and engineers, and theoretical and applied scientists. “This demonstrates a successful collaboration between several groups: algorithm developers and code optimization experts at Intel, applied math researchers at MANTISSA, and computer science researchers working on ExaHDF5. And this has successfully solved one of DOE’s leading big-data analytics problems on NERSC platforms,” said Prabhat, a team member and NERSC's group leader for Data and Analytics Services.

Other Berkeley Lab contributors are Suren Byna of CRD's Scientific Data Management Group, Zarija Lukic of the Computational Cosmology Center, and Yushu Yao of NERSC. »Read the ASCR Discovery article.

Bethel Outlines Exascale Visualization Challenges at New SIGGRAPH Symposium

Wes Bethel, leader of the Visualization Group in Berkeley Lab’s Computational Research Division, recently traveled to Kobe, Japan, to give an invited talk at SIGGRAPH Asia’s first Visualization in High Performance Computing symposium. His presentation focused on the U.S. exascale R&D agenda for visualization and analytics in high-performance computing (HPC).

“Each new generation of HPC platforms introduces a new set of challenges for data-intensive activities, which include scientific data management, analysis, and visualization (SDMAV),” Bethel said. “As we prepare for the Department of Energy’s Exascale Computing Initiative, we are carefully planning for several different types of SDMAV research challenges, some of which are the direct result of disruptive changes caused by the evolving computational architecture, while others are the result of how science will evolve to leverage this powerful new computational platform.”

Held November 2–5, the 8th ACM SIGGRAPH Asia Conference and Exhibition brought together more than 7,000 computer graphics professionals, technologists, researchers and industry representatives from 49 countries. The new Symposium on Visualization in High Performance Computing covered the development, technology and demonstration of visualization techniques and their interactive applications.

Dosanjh Delivers Invited Address to International Workshop on CoDesign

NERSC Director Sudip Dosanjh delivered an invited address to the fifth International Workshop on CoDesign held November 9–11 in Wuxi, Jiangsu, China. The workshop was held in conjunction with HPC China, one of the largest annual domestic conferences on HPC in China. Dosanjh's talk, "Cori: A Pre-exascale System for Science," offered background on NERSC and delved into Cori and the center's work towards exascale systems. »Download slides from the talk (PDF).

Prabhat Pens 'Big Science Problems, Big Data Solutions'

In a guest blog post for tech media company O'Reilly, NERSC's Prabhat recently outlined ten big data problems faced by scientists around the world and what NERSC is doing to address them. »Read more.

This Week's CS Seminars

Applied Math Seminar: Topology in Band Theory and Real Materials

Wednesday, Nov. 18, 3:30–4:30 p.m., 939 Evans Hall, UC Berkeley
Lukas Muechler, Princeton University

In this talk I will explain the notion of topologically non-trivial band structures in real materials. In the first part, I will give examples of parallel transport and holonomies in classical and quantum physics. By relating these concepts to the electronic structure of periodic solids, I will then explain topologically non-trivial states of matter, such as topological insulators as well as Weyl and Dirac metals. Furthermore, computational challenges as well as open problems regarding these topics will be discussed.

TRUST and CLTC Security Seminar: How Technologists Inform Policy

Thursday, Nov. 19, 1–2 p.m., 290 Hearst Memorial Mining Building, UC Berkeley
Ashkan Soltani, Federal Trade Commission

The Federal Trade Commission is the leading federal agency responsible for protecting privacy online. You might be familiar with the FTC's privacy and data security enforcement actions against some of the world's biggest tech companies, but what you may not know is how privacy and security research can inform the FTC's investigations. In fact, technology is at the core of the Commission's work. Come hear about some of the Commission's recent tech-related work and learn how technologists can inform policy, guide business, and help protect consumers as we transition to an always-on / always-connected world. Ashkan Soltani is a researcher focused on privacy, security, and behavioral economics, currently serving as the Chief Technologist for the Federal Trade Commission. Sponsored by the Team for Research in Ubiquitous Secure Technology (TRUST) and the Center for Long-Term Cybersecurity.

BIDS Data Science Lecture—Managing Complexity: Challenges of Modeling in Integrative Systems Biology

Friday, Nov. 20, 1–2:30 p.m., 190 Doe Library, UC Berkeley
Nancy J. Nersessian, Harvard University & Pittsburgh Center for Philosophy of Science

Over the last 10 years, there has been a rapid growth in analyses of computational modeling and simulation in the philosophy of science. Research on simulations has concentrated largely on simulations built using established background theories or theoretical models and the relations between these simulations and theory. Examples have been sourced mainly from the physical sciences, including astrophysics, nanophysics, and climate science.
My research group’s five-year ethnographic investigations of modeling practices in integrative systems biology have revealed that not all equation-based modeling is theory driven. The modelers we have studied have no background body of laws and principles of the biological domain that could provide the resources for constructing models. In the labs we investigated, engineers and applied mathematicians with little biological knowledge and usually no experimental experience attempt to model complex nonlinear biological networks for which the data are often sparse and rarely adequate for applying a set mathematical framework. Models are strategic adaptations to a complex set of constraints systems biologists are working under, ranging from data constraints to cognitive constraints to collaboration constraints.

I argue that simulation in systems biology is not, as currently characterized, just for experimenting on systems in order to find out the consequences of a model; rather, it plays a fundamental role in incrementally building the model, enabling the modeler to learn the relevant known and unknown features of a system and to gain an understanding of and make inferences about its dynamics. Simulation’s roles as a cognitive resource make possible the construction of representations of complex systems without a theoretical basis. Through the building process, modeler and model become a coupled cognitive system, which enables a modeler with limited knowledge of biology to make fundamental biological discoveries, as we have witnessed.