InTheLoop | 12.01.2014
Optimized Algorithms Boost Combustion Research
Turbulent combustion simulations, which provide input to the design of more fuel-efficient combustion systems, have gotten their own efficiency boost, thanks to researchers from the Computational Research Division (CRD) at Lawrence Berkeley National Laboratory (Berkeley Lab).
Matthew Emmett, Weiqun Zhang and John Bell developed new algorithmic features that streamline turbulent flame simulations, which play an important role in designing more efficient combustion systems. They tested the enhanced code on the Hopper supercomputer at Berkeley Lab's National Energy Research Scientific Computing Center (NERSC) and achieved a dramatic decrease in simulation times. Their findings appeared in the June 2014 issue of Combustion Theory and Modelling, and some of their simulations were featured on the journal cover. »Read more.
ESnet Connections Flow In, Out of SC14 Conference, Peaking at 270 Gbps
The booths have been dismantled, the routers and switches shipped back home and the SC14 conference in New Orleans officially ended Nov. 21, but many attendees are still reflecting on important connections made during the annual gathering of the high performance computing and networking community, including ESnet. The DOE science network's infrastructure brought a combined network capacity of 400 gigabits per second (Gbps) to the Ernest Morial Convention Center, with a peak flow of 270 Gbps. »Read more.
Antypas Featured in 'Ask Berkeley Lab'
Anyone reading this newsletter probably already knows what a supercomputer is, but could they explain it to the typical person-on-the-street? NERSC User Services Department Head Katie Antypas recently took on that challenge for a public affairs series called "Ask Berkeley Lab." In just under three minutes and with cameras rolling, Antypas explains what supercomputers are, what they do and why they're important to basic scientific research, using Edison as an example. »Watch the video.
This Week's CS Seminars
Applied Mathematics: Scalable variational inference for a generative model of astronomical images
Wednesday, Dec. 3, 2014, 3:30–4:30 p.m., 939 Evans Hall, UC Berkeley Campus
Jon McAuliffe, University of California, Berkeley
A central problem in astronomy is to infer the locations and other latent properties of stars and galaxies appearing in telescopic images. In these images, each pixel records a count of the photons—originating from stars, galaxies, and the background—that entered a particular region of a telescope's lens during an exposure. Each count is well modeled as a Poisson random variable, whose rate parameter is a deterministic function of the latent properties of nearby stars and galaxies. In this talk, I present a generative, probabilistic model of astronomical images, as well as a scalable procedure for inferring the latent properties of imaged stars and galaxies from it. Experimental results suggest that principled probabilistic models are a viable alternative to ad hoc approaches.
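The forward direction of the model described in the abstract can be sketched in a few lines: each pixel's photon count is drawn from a Poisson distribution whose rate is a deterministic function of nearby latent sources plus background. The sketch below is illustrative only and is not the speaker's implementation; the background level, star positions, fluxes, and the Gaussian point-spread-function width are all assumed values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 32, 32          # image size in pixels (assumed)
background = 5.0       # expected background photons per pixel (assumed)
psf_width = 1.5        # Gaussian point-spread-function width in pixels (assumed)

# Latent source properties: (row, col, total expected photons) -- illustrative
stars = [(10.0, 12.0, 500.0), (22.0, 20.0, 300.0)]

rows, cols = np.mgrid[0:H, 0:W]

# Deterministic rate: background plus each star's Gaussian photon blob
rate = np.full((H, W), background)
for r0, c0, flux in stars:
    psf = np.exp(-((rows - r0) ** 2 + (cols - c0) ** 2) / (2 * psf_width**2))
    rate += flux * psf / (2 * np.pi * psf_width**2)  # normalized so blob sums to ~flux

# Each pixel count is an independent Poisson draw at its rate
image = rng.poisson(rate)
```

Inference in the talk runs this direction in reverse: given `image`, recover the latent positions and fluxes, which the scalable variational procedure makes tractable at survey scale.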