
InTheLoop | 04.30.2012


NERSC Releases Mobile Apps to Users

In an effort to make NERSC resources more accessible to users, the facility is rolling out a number of applications that allow researchers to access scientific data through their web browsers, tablets, and smartphones. This month NERSC announced two new applications now available to its users:

  1. The NERSC mobile user portal lets researchers check the current status of NERSC systems and view the Message of the Day (MOTD), as well as log into their accounts to see recently completed, queued, and running jobs from their mobile phones.
  2. The NOVA portal is an experimental web application that allows licensed VASP users to submit jobs to NERSC systems. VASP (Vienna Ab initio Simulation Package) is a computer program for atomic-scale materials modeling. Based on user feedback from this early system, the team hopes to provide additional features in the future. For specific NOVA questions, please contact Annette Greiner.

In the coming months, NERSC will continue to improve these applications and create more products for the web. For specific comments or suggestions, contact consult@nersc.gov.

For more information on these applications, go here.


Vint Cerf and Van Jacobson Inducted into Internet Hall of Fame

Vint Cerf, a member of the ESnet Policy Board, and Van Jacobson, former head of Berkeley Lab’s Network Research Group, are in the first class of inductees into the Internet Hall of Fame. Cerf is honored as a Pioneer, and Jacobson as an Innovator. Read more.


Eli Dart Answers Questions About ESnet’s Science DMZ

As science becomes increasingly data-intensive, ESnet is helping research institutions fully capitalize on the growing availability of bandwidth by encouraging them to use a network design model called the “Science DMZ.” The Science DMZ is a specially designed local networking infrastructure aimed at speeding the delivery of scientific data. In this interview, Eli Dart, who leads the Science DMZ effort at ESnet, answers some basic questions about the project. Read more.


Berkeley Lab Hosts Albany High Students on Job Shadow Day

For the fourth year in a row, Berkeley Lab staff hosted juniors from Albany High School as part of the school’s annual Job Shadow Day. The event matches students with mentors in areas of interest specified by the students. This year, Computing Sciences hosted six students, while other Lab organizations hosted another six. Computing Sciences Communications Manager Jon Bashor and EETD’s Jonathan Slack, whose children attend Albany High, are parent volunteers on the school committee matching students and mentors for the event. Read more.


Roxanne Clark Wraps Up 28-Year Career of Keeping Movers and Shakers on Track

Throughout her 28-year career at Lawrence Livermore and Lawrence Berkeley national laboratories, Roxanne Clark built a strong track record of supporting senior managers who went on to bigger and better things. Read more.


13th ACTS Collection Workshop Will Be Held August 14–17

The 13th Workshop on the DOE Advanced Computational Software (ACTS) Collection: Scalable and Robust Computational Libraries and Tools for High-End Computing will be held August 14–17, 2012 at Berkeley Lab. The workshop will include a range of tutorials on the tools currently available in the collection, discussion sessions aimed at addressing the specific computational needs of workshop participants, and hands-on practice using NERSC’s state-of-the-art computers.

The ACTS Collection comprises a set of noncommercial software tools developed mainly at DOE laboratories, sometimes in collaboration with universities. These tools aim to simplify the solution of common and important computational problems, and they have substantially benefited a wide range of applications and fields in computational science. These benefits include not only applications that run efficiently in high-performance computing environments, but also computations that would not have been possible otherwise. With the introduction of hardware technologies such as multicore processors, the use of software libraries has become key to developing high-end applications, because libraries provide a level of abstraction through which robustness, scalability, and portability can be reliably carried into application codes across a large class of computer platforms.
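
As a small taste of the factor-once, solve-many workflow these libraries support, the sketch below solves a sparse linear system with SuperLU, one of the direct solvers distributed in the ACTS Collection, reached here through SciPy's Python wrappers. The matrix and right-hand side are toy values chosen purely for illustration and are not taken from the workshop tutorials.

    # Minimal sketch: a sparse direct solve with SuperLU (one of the ACTS
    # Collection solvers), accessed through SciPy's wrappers. The 1-D Poisson
    # matrix and right-hand side below are toy values for illustration.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 1000
    main = 2.0 * np.ones(n)            # tridiagonal 1-D Poisson (Laplacian) matrix
    off = -1.0 * np.ones(n - 1)
    A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")

    b = np.ones(n)                     # right-hand side
    lu = spla.splu(A)                  # LU factorization performed by SuperLU
    x = lu.solve(b)                    # triangular solves reuse the factorization

    print("residual norm:", np.linalg.norm(A.dot(x) - b))

The same pattern of handing the hard numerical work to a library carries over to the distributed-memory tools in the collection, such as SuperLU_DIST, ScaLAPACK, and PETSc, which is how application codes gain scalability and portability without reimplementing the solvers themselves.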

The workshop is open to computational scientists from industry and academia. Registration fees are fully sponsored by the DOE Office of Science. In addition, DOE will sponsor travel expenses and lodging for a limited number of graduate students and postdoctoral fellows. Applications are due June 24, 2012. For more information on the workshop, go here, or contact Tony Drummond at (510) 486-7624.


This Week’s Computing Sciences Seminars

Self-Association of Therapeutic Monoclonal Antibodies: A Coarse-Grained Perspective on the Role of Electrostatics and More
Monday, April 30, 10:00–11:00 am, 50F-1647
Anuj Chaudhri, University of Chicago and Genentech Inc.

Immunotherapy harnesses the immune system to treat infectious diseases, either actively through vaccines or passively through therapeutic monoclonal antibodies (MAbs). These monoclonal antibodies must be administered to a patient in very high doses (>1 mg/kg) due to potency issues. The usual route of administration until now has been intravenous, but this leads to higher patient care costs and the need for skilled workers. Subcutaneous (SC) delivery is a more convenient route of administration; however, it places an upper limit on the dosage volume that can be administered at a given time, typically <1.5 ml, and thus necessitates the development of formulations for SC administration at concentrations >100 mg/ml. High-concentration protein solutions pose formulation challenges such as high viscosity during manufacturing, protein stability issues leading to association and aggregation, degradation of the drug, and undesired immunogenic responses in the body. Hence, understanding self-association and aggregation can lead to more stable therapeutic drugs that can be administered more safely and at lower patient cost.

Coarse-grained computational models of two therapeutic monoclonal antibodies are constructed to understand the effect of domain-level charge-charge electrostatics on self-association at high protein concentrations. The coarse-grained representations of the individual antibodies are built using an elastic network normal mode analysis, with two models constructed for each antibody: a compact Y-shaped and an extended Y-shaped configuration. The resulting coarse-grained antibodies, which interact through screened electrostatics, are simulated at six different concentrations. One monoclonal antibody (hereafter MAb1) is observed to form three-dimensional heterogeneous mesophase structures with dense regions, whereas a different monoclonal antibody (hereafter MAb2) forms more homogeneous structures. These structures, together with the potentials of mean force (PMF) and radial distribution functions (RDF) between pairs of coarse-grained regions on the MAbs, are qualitatively consistent with the experimental observation that MAb1 has a significantly higher viscosity than MAb2, especially at concentrations >50 mg/ml. The structures in MAb1 are also observed to form through stronger Fab-Fab interactions, in corroboration with experiment. The coarse-grained representations are effective at picking up differences based on the local charge distributions of domains and at making predictions about the self-association characteristics of these protein solutions. This is the first computational study of its kind to show that two different monoclonal antibodies form different structures at high concentrations. The methodology is being further developed to understand the effect of swapping charges on these MAbs by creating mutations at the antigen-binding sites, and the role of Fab-Fc interactions, using screened electrostatic potential energy maps and CG simulations.
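
For readers unfamiliar with the screened electrostatics mentioned above, the sketch below evaluates a generic Debye-Hückel (screened Coulomb) pair energy between coarse-grained charged beads. It is only an illustration of that interaction form, not the model used in this study; the charges, positions, screening length, and unit conventions are all made-up values.

    # Illustrative Debye-Hueckel (screened Coulomb) pair energy between
    # coarse-grained charged beads. All parameters are invented for this
    # sketch and are not taken from the simulations described in the talk.
    import numpy as np

    KE = 138.935  # Coulomb constant, kJ mol^-1 nm e^-2 (vacuum)

    def screened_coulomb_energy(positions, charges, kappa=1.0, eps_r=80.0):
        """Sum over bead pairs of (KE / eps_r) * qi * qj * exp(-kappa * r) / r.

        positions : (N, 3) bead coordinates in nm
        charges   : (N,) bead charges in units of e
        kappa     : inverse Debye screening length, 1/nm
        eps_r     : relative permittivity of the solvent
        """
        energy = 0.0
        n = len(charges)
        for i in range(n):
            for j in range(i + 1, n):
                r = np.linalg.norm(positions[i] - positions[j])
                energy += KE / eps_r * charges[i] * charges[j] * np.exp(-kappa * r) / r
        return energy

    # Toy configuration: three beads with mixed charges
    pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.5, 0.0]])
    q = np.array([+1.0, -1.0, +0.5])
    print("screened electrostatic energy (kJ/mol):", screened_coulomb_energy(pos, q))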

Visualization and Analysis of Large Data From Simulations
Wednesday, May 2, 12:00–1:00 pm, 310 Sutardja Dai Hall (Banatao Auditorium), UC Berkeley
Hank Childs, LBNL/CRD/NERSC

Visualization and analysis are critical to the success of the simulation process; they help realize the value of computing by increasing the rate at which new science is discovered. Their techniques are used to confirm that simulations are running correctly, to communicate simulation results to an audience, and, most importantly, to explore data, which is often where new insights are obtained.

As supercomputers get ever larger, simulations are producing increasingly massive data sets, creating two major challenges for visualization and analysis: (1) how to handle the scale of the data and (2) how to reduce its complexity to produce results that will truly enable insight. In this presentation, I will focus primarily on the scale issue and describe the barriers to scalable parallel performance. Finally, because the field is changing rapidly and supercomputers will soon be heavily power-constrained, I will describe why visualization and analysis techniques must evolve, and how.

Convex Approaches to Text Mining: LAPACK Seminar
Wednesday, May 2, 12:10–1:00 pm, 380 Soda Hall, UC Berkeley
Brian Gawalt, UC Berkeley

Many text mining tasks—ranking, classification, clustering—can be posed as solving a convex optimization problem over a vector-space model of the document set. However, the models fit by these methods are often dense, with one parameter fit per n-gram token, making them unsuitable for human interpretation. When these text mining models are made sparse via stringent l1 regularization, they yield valid, fully interpretable summaries of the underlying documents, and the preserved convexity of the underlying problem suggests the opportunity to scale to large corpora of documents.

The presentation will begin with an example vector-space text retrieval task, proceed to an introduction of the predictive (semi- to fully supervised learning) framework for summarization, and present results from a human validation experiment showing the merit of the sparse, convex approaches. It will also cover an extension of a simple predictive algorithm to a MapReduce framework for large-scale summarization and conclude with an exploration of a sparse topic modeling (unsupervised) approach to summarization.
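
As a generic illustration of the sparse-model idea (not the speaker's code), the sketch below fits an l1-penalized logistic regression over unigram and bigram counts using scikit-learn; the tiny corpus, labels, and penalty strength are invented, and the few n-grams left with nonzero weight play the role of the interpretable summary.

    # Sketch: l1-regularized (sparse) text classification as a summarization
    # device, using scikit-learn. The corpus, labels, and penalty strength C
    # are made up for illustration.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    docs = [
        "supernova survey detects new transient events",
        "telescope survey maps distant galaxies",
        "quarterly earnings beat analyst expectations",
        "company reports strong quarterly revenue growth",
    ]
    labels = [1, 1, 0, 0]  # 1 = astronomy stories, 0 = business stories

    # Vector-space model: unigram and bigram counts
    vectorizer = CountVectorizer(ngram_range=(1, 2))
    X = vectorizer.fit_transform(docs)

    # In scikit-learn's convention, smaller C means stronger l1 regularization
    # and hence a sparser model; the value here is arbitrary.
    model = LogisticRegression(penalty="l1", C=2.0, solver="liblinear")
    model.fit(X, labels)

    # The surviving n-grams summarize what separates the two groups of documents
    terms = vectorizer.get_feature_names_out()
    nonzero = [(terms[i], w) for i, w in enumerate(model.coef_[0]) if w != 0.0]
    print(sorted(nonzero, key=lambda t: -abs(t[1])))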

Six Things Scientists Can Learn from Science Journalists
Thursday, May 3, 12:00–1:00 pm, 90-3122
Maggie Koerth-Baker, BoingBoing.net

When you talk about your research, do you feel like you’re talking to yourself? Have you ever accidentally left a lay person more confused than they were before they met you? Does your left eye go twitchy every time a journalist calls? Communicating science is scary. Fortunately, the same lessons that turn cringe-worthy journalism into smart science reporting can help you do a better job of communicating your own work—whether directly to the public, or to journalists themselves. Don’t freak out. Don’t give up. Instead, come to this presentation.



About Computing Sciences at Berkeley Lab

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.