InTheLoop | 02.23.2015
NERSC Hosts Science Talks Tomorrow at the Lab
On Tuesday, Feb. 24, the NERSC Users Group (NUG) is hosting a series of science talks in the Bldg. 50 auditorium at Berkeley Lab. Part of the annual NUG meeting, these talks are open to all interested lab staff, not just NUG attendees. Among the presentations are keynote talks on “Efficient modeling of laser-plasma accelerators using the ponderomotive-based code INF&RNO” by Carlo Benedetti of Berkeley Lab’s BELLA Center and the LOASIS Program, and “Transforming Beamline Science with SPOT Suite” by CRD’s Craig Tull. NERSC’s Katie Antypas will discuss “Cori: A Next-Generation System for Computational and Data Science” and CRD’s Phil Colella will give a talk on “Programming Next-Generation HPC Systems.” »See the full NUG meeting agenda.
CRD Volunteer Supports Worldwide Girls Hackathon
As part of the Black Girls Code team of volunteers, Dani Ushizima of the Computational Research Division spent her President's Day weekend coaching girls participating in the Ignite International Girls Hackathon. During the event, sponsored by the Global Fund for Women and United Nations Women, girls from all over the world spent 24 intense hours hacking solutions to a challenge facing women and girls everywhere: access to safe spaces. Safe spaces include spaces where girls can be free from violence, where they are supported by peer and adult role models, where they have access to education and learning, and where girls’ voices are heard and amplified free from fear or threats.
Ushizima, the only Black Girls Code Oakland mentor who volunteered for the entire event, said, "As a woman scientist, it is important for me that young girls from the Bay Area are not left on the sidelines." »KTVU Channel 2 featured the hackathon on its nightly news broadcast and a French television crew covered the Oakland team, which will also be featured in a Global Fund for Women video about the event. »Ushizima has volunteered with the Oakland chapter of Black Girls Code for previous software development events.
BIDS Calls for Data Science Fellow Applications
The Berkeley Institute for Data Science invites applications for its next cohort of the BIDS Data Science Fellow Program.
Successful applicants will join the current cohort of fellows in helping make data analysis easier in the research sciences. BIDS data science fellows are postdoctoral scholars, graduate student researchers, or staff with excellent credentials in their fields as well as strong interests in advancing data-analysis approaches with a community of like-minded individuals from across campus.
Applications are currently being accepted for "joint campus appointment" and "graduate student researcher." A small number of full-time postdoctoral positions will be available in addition to joint campus appointments. However, recruitment won't begin until early March 2015. To be notified about the postdoc recruitment, please send your contact information to email@example.com.
Applications are currently limited to individuals with backgrounds and research interests in the social, physical, and life sciences and/or statistics, applied math, and computer science. Eligibility will be determined by the applicants’ research focus, not their home department.
Data does not have to be “big data” for program eligibility. The institute encourages applications from cross-disciplinary groups and from individuals exploring topics that may broaden the research diversity of the BIDS community (e.g., genetics, economics, and the Internet of Things). »Learn more and apply.
Nominations Open for Lab Awards
Nominations are now being accepted for the LBNL Director’s Awards for Exceptional Achievement and the Berkeley Lab Prize-Lifetime Achievement Awards. These awards were established to recognize teams and individuals for significant accomplishments advancing the lab’s mission and strategic goals. Any lab employee may submit a nomination, due March 15. Honorees will be acknowledged at a lab awards ceremony and reception held in the summer of 2015. For details on eligibility, criteria, instructions, and to submit a nomination, please »visit the LBNL Employee Recognition web site.
This Week's CS Seminars
Indexing Big Data
Monday, Feb. 23, 10 – 11 a.m., Bldg. 50F, Room 1647
Michael A. Bender, Stony Brook University & Tokutek, Inc., and Rob Johnson, Stony Brook University
This talk addresses the problem of ingesting and indexing massive data sets. Traditional storage systems (databases, file systems, document stores) based on B-trees are I/O-bound on many workloads. This talk explains how write-optimized indexing can dramatically reduce the number of I/Os associated with some traditional workloads, enabling big-data applications to scale by orders of magnitude. The talk explores write-optimization from the perspectives of foundational theory, implementation, and technology transfer.
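Write-optimized indexes reduce I/O by buffering updates and applying them in batches, so each expensive write moves many keys at once. As a rough, illustrative sketch of that principle only (the `BufferedIndex` class below is a hypothetical toy, not the Bε-tree or fractal-tree structures the speakers work on), consider:

```python
import bisect

class BufferedIndex:
    """Toy write-optimized index: inserts land in an in-memory buffer
    and are flushed to a sorted run in batches, so the cost of the
    sorted-structure update is amortized over many keys."""

    def __init__(self, buffer_size=4):
        self.buffer_size = buffer_size
        self.buffer = {}   # recent writes, unsorted and cheap to absorb
        self.keys = []     # sorted run, standing in for on-disk structure
        self.values = {}

    def insert(self, key, value):
        self.buffer[key] = value
        if len(self.buffer) >= self.buffer_size:
            self.flush()

    def flush(self):
        # One batched "I/O": merge the entire buffer into the sorted run.
        for key, value in self.buffer.items():
            if key not in self.values:
                bisect.insort(self.keys, key)
            self.values[key] = value
        self.buffer.clear()

    def lookup(self, key):
        # Queries must consult the buffer first, then the sorted run.
        if key in self.buffer:
            return self.buffer[key]
        return self.values.get(key)
```

Real write-optimized structures such as Bε-trees place buffers at every internal node and flush them downward lazily; this one-level toy only shows why batching amortizes write cost at the price of an extra place to look on reads.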
Numerical Simulation of Turbulent Two-Phase Flows Using Interface-Capturing Schemes
Monday, Feb. 23, 11 a.m. – 12 p.m., NERSC OSF 943, Conference Room 236
Andrey Ovsyannikov, Center for Turbulence Research, Mechanical Engineering Department, Stanford University
In a variety of physical processes, a discontinuity in physical properties is represented by an evolving fluid interface. Examples include immiscible gas-liquid flows, premixed flames, solidification and melting phenomena, and many others. For numerical simulation of flows with moving interfaces, level-set methods are often used to capture the interface dynamics. Despite many inherent advantages, the level-set approach suffers from unphysical loss of fluid mass as time evolves, which is critical in the simulation of turbulent two-phase flows. The first part of this talk will be devoted to recent improvements of the level-set method developed in my Ph.D. dissertation. The main idea of the proposed approach is to introduce a new source term into the level-set equation. In continuous space, the proposed source term does not change the solution for the zero level set, so the interface dynamics analytically remains the same. In discrete space, the new form of the level-set equation provides several advantages. Compared to the standard approach with a reinitialization procedure, the new approach requires fewer level-set reinitializations, which improves the resolution of the zero level set and reduces mass-conservation errors. The second part of the talk will be devoted to a particular example of an interface-capturing method (a hybrid level-set/volume-of-fluid method) and high-performance computing for a large-scale numerical simulation of turbulent Couette flow. The objective of this work is to develop an understanding of bubble-generation mechanisms due to interactions between free surfaces and turbulent boundary layers, as commonly seen near ship walls. I will present results from these calculations revealing the effects of bubbles on the modulation of turbulence and the influence of water depth on the amount of air entrained.
I will conclude my talk with an overview of the steps required to reach numerical simulations of complex interfacial flows at realistic parameters. These steps span all aspects of the problem, from the development of new physical models to the optimization of numerical solvers and the efficient use of parallel algorithms for high-performance computing.
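For readers unfamiliar with the method: a level-set function φ is advected with the flow velocity, and the interface is its zero isocontour. A generic sketch of the kind of modification the abstract describes (S here is only a placeholder; the speaker's actual source term is given in his dissertation) is:

```latex
% Standard level-set advection: the interface is the set \phi = 0
\frac{\partial \phi}{\partial t} + \mathbf{u} \cdot \nabla \phi = 0

% Modified form with a source term S that vanishes on the zero level set,
% so the interface dynamics is unchanged in continuous space:
\frac{\partial \phi}{\partial t} + \mathbf{u} \cdot \nabla \phi = S(\phi),
\qquad S(\phi)\big|_{\phi = 0} = 0
```

Because S vanishes exactly where φ = 0, the analytical interface motion is untouched, while away from the interface S can be designed to keep φ well behaved and reduce how often costly reinitialization is needed.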
Software Support for Efficient Use of Modern Parallel Systems
Tuesday, Feb. 24, 10 – 11 a.m., Bldg. 50B, Room 2222
Milind Chabbi, Department of Computer Science, Rice University
Achieving top performance on modern many-core, accelerated, multi-node systems is a daunting challenge. Production software systems suffer from performance losses at multiple layers of the software stack, which can be broadly attributed to three main causes: resource idleness, wasteful resource consumption, and insufficient tailoring for architectural characteristics. Effective software tools, adaptive runtime systems, and efficient algorithms developed as part of my doctoral dissertation address instances of performance problems due to each of these causes. Tools and techniques that I developed have demonstrated their effectiveness by identifying performance losses arising from developers' inattention to performance, inappropriate choice of data structures, inefficient algorithms, use of heavyweight abstractions, and ineffective compiler optimizations. This work has had practical impact through collaborations with national laboratories and with industrial and academic partners. An adaptive runtime developed in collaboration with LBNL eliminated more than 60% of redundant barriers in production runs of NWChem, a flagship DOE computational chemistry code. A fine-grained instruction-monitoring framework pinpointed inefficiencies, which helped us substantially improve the performance of several important codes. Idleness analysis for heterogeneous architectures, developed as part of Rice University's HPCToolkit performance tools, helped us diagnose and correct both hardware and software causes of performance losses for important codes running on accelerated supercomputers. Novel, architecture-aware synchronization algorithms deliver high throughput, high fairness, and superior scaling of highly contended locks on deep NUMA architectures.
CITRIS Seminar: Appropriate Technology for Sustainable Urban Development
Wednesday, Feb. 25, 12 – 1 p.m., Banatao Auditorium, Sutardja Dai Hall, UC Berkeley Campus
Sustainability advocates have long had conflicted reactions to technology. On one hand, many individuals going back to 19th century Luddites and 1960s and 70s “appropriate technology” pioneers have questioned whether many forms of technology are desirable or sustainable. On the other hand, other technologies such as those related to renewable energy, low-carbon transportation, ecological restoration, and social mobilization clearly play an important role in sustainable development. The debate over appropriate technology for the twenty-first century is very much with us currently as California considers whether to invest tens or hundreds of billions of dollars in technologies such as high-speed rail and ocean water desalinization. There are no easy answers to the questions surrounding “sustainability technology,” if we can call it that. However, this talk will review the historic tensions between “technology” and “sustainability,” identify key areas of conflict and synergy, and propose guidelines that may be useful for sustainable city technology in the future. Lunch provided for those who »register in advance.
Applied Math Seminar: A Conversation with the IARPA Director
Wednesday, Feb. 25, 1 – 2 p.m., HP Auditorium, 306 Soda Hall, UC Berkeley Campus
Dr. Peter Highnam, IARPA Director
Dr. Peter Highnam, Director of IARPA (Intelligence Advanced Research Projects Activity) will provide an introduction to IARPA, discuss its research priorities, share the best ways to participate in IARPA-funded applied research and to engage with its experts, as well as answer questions in an open forum. IARPA executes high-risk, high-payoff research for the U.S. Intelligence Community. It funds mostly applied research at universities and industry through open solicitation program calls.
Analysis of the Pollution Effect in Finite Element Discretization of Highly Heterogeneous Helmholtz Problems
Wednesday, Feb. 25, 2:30 – 3:30 p.m., 939 Evans Hall, UC Berkeley
Theophile Chaumont Frelet, INRIA and INSA de Rouen, France
Time-harmonic waves, modeled by the Helmholtz equation, are used in several engineering processes including, for instance, radar and seismic imaging. In the context of seismic imaging, waves propagate through the earth, which can be represented (in the simplest case) as a heterogeneous acoustic medium. Depending on the application, simulation of high-frequency waves can be required, especially for high-resolution imaging. Numerical approximation of high-frequency waves is a challenging problem, even in homogeneous media. Indeed, numerical approximations suffer from the so-called pollution effect: if the number of discretization points per wavelength is kept constant, the numerical solution diverges from the best approximation the scheme is capable of as the frequency increases. As a result, drastic conditions are imposed on the mesh at high frequency: the number of points per wavelength must grow as the frequency increases. In the case of homogeneous media, it has been shown (and observed numerically) that high-order methods are able to reduce the pollution effect, making them cheaper than low-order methods at high frequencies. However, the application of high-order methods to highly heterogeneous media is not trivial. High-order methods are built on coarser meshes, so they do not capture fine-scale variations of the propagation medium if the parameters are taken to be constant in each cell. The aim of this work is the study of a multiscale medium-approximation strategy for high-order methods. We propose a theoretical analysis of the pollution effect in the context of multiscale medium approximation using polynomial shape functions, and we present numerical experiments (including geophysical benchmarks).
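As background (standard results from the finite-element literature, not claims specific to this talk): for the Helmholtz equation with wavenumber k, classical estimates of the kind due to Ihlenburg and Babuska for degree-p elements on a mesh of size h have the schematic form below. The second, k-growing term is the pollution error; keeping kh fixed (constant points per wavelength) controls the first term but not the second, while raising the order p damps both.

```latex
% Helmholtz equation with wavenumber k
-\Delta u - k^2 u = f

% Schematic error bound for degree-p finite elements on mesh size h:
\| u - u_h \| \;\lesssim\;
\left( \frac{kh}{p} \right)^{p}
\;+\;
k \left( \frac{kh}{p} \right)^{2p}
```

This is why high-order methods are attractive at high frequency, and also why the heterogeneous case is delicate: the larger cells that make high p economical are the same cells in which a piecewise-constant medium approximation misses fine-scale variation.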
Nonlinear Dynamics, High Dimensional Data, and Persistent Homology
Thursday, Feb. 26, 12 – 1 p.m., Bldg. 50F, Room 1647
Konstantin Mischaikow, Rutgers University
It is almost cliché at this point to note that high dimensional data is being collected from experiments or generated through numerical simulation at an unprecedented rate and that this rate will continue rising extremely rapidly for the foreseeable future. Our interest is in data associated with nonlinear dynamics. The focus of this talk is on our efforts to use topological tools to characterize and classify high dimensional nonlinear time series associated with processes that exhibit complex spatiotemporal patterns. The long term goal is to develop robust efficient techniques for comparing experimental work against numerical simulation. I will introduce the necessary mathematical theory and provide examples arising from fluid flow and dense granular media.
BIDS Guest Lecture – OpenVDB: An Open Source Data Structure and Toolkit for High-Resolution Volumes
Thursday, Feb. 26, 3:30 – 5 p.m., 190 Doe Library, UC Berkeley
Ken Museth, DreamWorks Animation
VDB is a C++ library comprising a novel hierarchical data structure and a suite of tools for the efficient storage, rendering, and manipulation of sparse volumetric data discretized on three-dimensional grids. Following the open sourcing of VDB (http://www.openvdb.org) at SIGGRAPH 2012 and Houdini's full integration of OpenVDB since version 12.5, this presentation targets both developers who seek insight into the novel data structure and new adopters who simply wish to experiment with the new technology. OpenVDB ships with a rich toolset of high-level volumetric processing and conversion tools that can be applied directly in VFX pipelines. To this end, the presentation will focus on the following aspects of VDB: a technical description of its underlying data structure and algorithms; its accompanying toolset; and, finally, its integration into Houdini as a first-class citizen. OpenVDB and its predecessor VDB have to date been used in more than 70 feature films, including all of the films nominated for the 2015 Academy Award for Best Visual Effects. The talk is delivered by VDB's inventor, Ken Museth, and takes as its point of reference a technical paper presented at SIGGRAPH 2013.
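To illustrate why a sparse, blocked layout pays off for high-resolution volumes, here is a much-simplified Python analogue (hypothetical code, not OpenVDB's actual C++ tree, which uses a fixed-depth B+tree-like hierarchy with bit-mask-compressed nodes): space is tiled into small blocks, and only blocks that contain set voxels are allocated.

```python
class SparseGrid:
    """Toy sparse volume: space is divided into 8x8x8 blocks and only
    blocks containing set voxels are allocated, so memory scales with
    the active region rather than the bounding box (the core idea
    behind VDB's hierarchy, greatly simplified)."""

    BLOCK = 8

    def __init__(self, background=0.0):
        self.background = background  # value reported for unset voxels
        self.blocks = {}  # (bx, by, bz) -> {local voxel coord: value}

    def _split(self, i, j, k):
        # Split a global voxel coordinate into a block key and a
        # coordinate local to that block.
        b = self.BLOCK
        return (i // b, j // b, k // b), (i % b, j % b, k % b)

    def set(self, i, j, k, value):
        block_key, local = self._split(i, j, k)
        self.blocks.setdefault(block_key, {})[local] = value

    def get(self, i, j, k):
        block_key, local = self._split(i, j, k)
        return self.blocks.get(block_key, {}).get(local, self.background)
```

Two voxels a thousand units apart cost two small blocks rather than a dense 1001-cubed array; VDB pushes the same idea through several tree levels, with cache-friendly node layouts and fast constant-time random access.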
BIDS Data Science Lecture: Municipal Governments' Use of Open Data
Thursday, Feb. 26, 3:30 – 5 p.m., 190 Doe Library, UC Berkeley
Michal Migurski, Code for America
The speaker will address municipal governments' use of open data, Code for America’s drive to get small and local governments up to speed on modern ways of working with technology, and the community benefits that come from new approaches to data and code.