InTheLoop | 11.25.2002

The Weekly Electronic Newsletter for Berkeley Lab Computing Sciences Employees

November 25, 2002

LBNL-Led Team Wins Third Annual Bandwidth Challenge at SC2002 Conference

An international team led by Berkeley Lab notched its third consecutive
victory in the High-Performance Bandwidth Challenge at the SC2002
Conference held last week in Baltimore. The team won top honors for the
Highest Performing Application, moving data at a peak speed of 16.8
gigabits (16.8 billion bits of data) per second. The team used clusters
of computers at six sites in the United States, the Netherlands and the
Czech Republic.

Entitled "Wide Area Distributed Simulations using Cactus, Globus and
Visapult," the winning application modeled gravitational waves generated
during the collision of black holes. "It was great to win it again for
the third straight year," said Berkeley Lab's John Shalf, leader of the
team. This year's winning data rate was more than five times higher than
the team's record-setting win at the SC2001 conference. Other LBNL team
members are Wes Bethel, George "Chip" Smith, John Christman, Al Early,
Cary Whitney, Shane Canon, Mike Bennett and Eli Dart.

First held in 2000, the High-Performance Bandwidth Challenge encourages
teams of researchers from around the world to use, if not swamp, the
conference network to demonstrate applications using huge amounts of
data. The challenge is sponsored by Qwest Communications and provides
cash prizes for the winning teams in three categories.

Participating sites in the winning effort were the Parallel Distributed
Systems Facility at the National Energy Research Scientific Computing
(NERSC) Center at LBNL; and clusters at the SC2002 conference in
Baltimore, Argonne National Laboratory, the National Center for
Supercomputing Applications, the University of Amsterdam, and Masaryk
University in the Czech Republic. Support was provided by the Albert
Einstein Institute in Germany, the Poznan Supercomputing and Networking
Center in Poland, DOE's Energy Sciences Network
(ESnet), Sandia National Laboratories, SysKonnect, Hewlett-Packard and
Force10 Networks Inc.

"We're happy to have helped Berkeley win again, especially since they
were our first customer," said Rob Quiros, marketing director for
Force10 Networks, makers of the E-Series high performance 10 gigabit
Ethernet switch used by the team. "This successful collaboration again
proves that Force10 continues to define high-performance networking for
demanding applications by delivering true 10 Gig E performance."

Shalf said the team used a "global grid infrastructure testbed" to win -
a resource grouping similar to the one that won two awards in the
SC2002 High Performance Computing Challenge. "As we build a more global
infrastructure, researchers will be able to choose from resources around
the world to increase their throughput," Shalf said.

The team ran the Visapult volume rendering application at SC2002 to
create visualizations from the simulations being run on the
participating clusters. Pulling in that data is what filled one OC-48
and two OC-192 lines feeding into the Baltimore Convention Center. The
team used a cluster of Compaq computers loaned by Hewlett-Packard and
SysKonnect network interface cards to put together the winning effort.
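For readers curious how the team's peak rate compares with the capacity of those circuits, here is a back-of-the-envelope sketch using the standard SONET building block of 51.84 megabits per second per OC-1 channel (the actual provisioned rates at the convention center may have differed slightly):

```python
# Aggregate capacity of the uplinks into the Baltimore Convention Center,
# computed from the standard SONET rate of 51.84 Mb/s per OC-1 channel.
OC1_MBPS = 51.84  # base SONET line rate

def oc_rate_gbps(n):
    """Line rate of an OC-n circuit in gigabits per second."""
    return n * OC1_MBPS / 1000

links = [48, 192, 192]  # one OC-48 and two OC-192 lines
capacity = sum(oc_rate_gbps(n) for n in links)
peak = 16.8  # Gb/s, the team's winning peak rate

print(f"Aggregate capacity: {capacity:.1f} Gb/s")  # about 22.4 Gb/s
print(f"Peak utilization:   {peak / capacity:.0%}")  # roughly 75%
```

By this estimate the winning run drove the conference's inbound links at roughly three-quarters of their theoretical capacity.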

According to Wes Bethel, the head of LBNL's Visualization Group and
developer of Visapult, improvements in Visapult, along with the evolving
networking and Grid infrastructure of hardware, software and middleware,
helped push the team's data transfer to such a high rate. Members of the
team tested the basic setup last July to demonstrate the feasibility of
10 gigabit Ethernet capability.

The LBNL-led team won the first ever Bandwidth Challenge at SC2000,
moving data at an average of 596 megabits per second over 60 minutes and
hitting a peak of 1.48 gigabits per second over a five-second period. At
the SC2001 conference, the team took the top prize by achieving a
sustained network performance level of 3.3 gigabits per second.
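The three-year progression can be sanity-checked with quick arithmetic, using only the figures cited in the article:

```python
# Year-over-year Bandwidth Challenge results cited in the article,
# all in gigabits per second.
results = {
    "SC2000 peak": 1.48,       # avg 0.596 Gb/s over 60 minutes
    "SC2001 sustained": 3.3,
    "SC2002 peak": 16.8,
}

# "More than five times higher" than SC2001:
print(f"SC2002 vs SC2001: {results['SC2002 peak'] / results['SC2001 sustained']:.1f}x")
# Growth since the first challenge:
print(f"SC2002 vs SC2000: {results['SC2002 peak'] / results['SC2000 peak']:.1f}x")
```

The 16.8 Gb/s peak is indeed just over five times the SC2001 mark, and more than eleven times the SC2000 peak.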


Reminder: Open Enrollment Ends at Midnight Saturday, November 30

This year's Open Enrollment for selecting health care providers and
other benefits ends at midnight on Saturday, November 30. For more
information and instructions on making changes, go to


Bruce Bargmeyer Gives Keynote at Software Conference in China

Bruce Bargmeyer of the Scientific Data Management Group in the
Computational Research Division gave a keynote presentation at the Asian
Forum for Business Object & Software Components 2002 (BOSC 2002). The
meeting was held at the State Key Laboratory of Software Engineering at
Wuhan University, China, from October 30 to November 3. The main purpose
of the forum was to exchange the latest research results and development
experiences, and to discuss and establish standards for business objects
and software components. At the conclusion of the forum, Bargmeyer was
presented a document appointing him to be a guest professor at Wuhan
University for three years.


Find Out More about Spam Control at December 10 Brown Bag

The increased amount of spam that Lab users have been receiving has
generated considerable concern, as well as interest regarding how to
control it. Mark Rosenberg, manager of the Computing Infrastructure
Technologies Group, will give a presentation on how to control spam at
the next computer protection brown bag event at noon Tuesday, December
10. Bring your lunch and your questions.


Latest CS Job Postings and a Reminder about Rewards for Referrals

To help staff learn of potential professional opportunities, InTheLoop
is now featuring some of the top job openings in CS's three divisions.
Employees are also reminded that the Lab's Employee Referral Incentive
Program (ERIP) is still in effect and pays $1,000 (net) to employees who
refer successful candidates. For ERIP details, go to

This week's top CS job listings (and links to the postings) are:

Information Technologies and Services Division

Computing Infrastructure Support Dept.

Desktop Support Engineer: Perform desktop support, maintain the
Facilities Web site, assist with deployment of Web applications built
with JSP, BEA WebLogic, and XML server technology, independently assess
and resolve user problems involving multiple desktop platforms and
architectures, and install, maintain and administer NT4 and Windows 2000
servers. Read
more at http://www.lbl.gov/CS/Careers/OpenPositions/IT15443.html.

Information Systems and Services Department

Web Developer: As a member of the Data Warehousing group, design, implement,
customize and maintain the Institutional Information Portal. Respond
quickly and appropriately to customer requests for upgrades and bug
fixes. Work closely with internal laboratory customers to define,
develop, and implement database web applications and enhancements. Read
more at http://www.lbl.gov/CS/Careers/OpenPositions/IT15358.html.

Information Systems and Services Department

Database Specialist: Provide technical support and expertise in the
Oracle Database Administration (DBA) group. Work will emphasize
performance monitoring and tuning at both the database and application
level.

Computational Research Division
Distributed Systems Department
Python XML Developer: Work as part of a team designing and developing
solutions for building distributed Grid middleware for the DOE Office of
Science community (http://www.er.doe.gov/). Current projects use Python,
XML, SOAP, TLS and the Grid Security Infrastructure (www.globus.org)
technologies. Future directions will include developing a full Open Grid
Services Architecture (http://www.globus.org/ogsa/) implementation in
Python, XML-Sec work, including security in SOAP messaging, and XML
digital signing.


Windows XP Security Course to Be Taught on December 13

Windows XP, the newest in the line of Microsoft Windows operating
systems, is now the standard for PCs here at Berkeley Lab. What kinds of
measures are necessary to make Windows XP run securely? You can find out
by attending the free half-day course on Windows XP security offered by
the Berkeley Lab Computer Protection Program. A description of the
course is at
Bring your XP laptop with you, if you have one.
Time and Date: 9 a.m. - 12:15 p.m., Friday, December 13
Location: Bldg. 50 auditorium
Instructors: Gene Schultz, Berkeley Lab, and Jason Judkins, Lawrence
Livermore National Laboratory
To enroll: Visit http://hris.lbl.gov/.

About Computing Sciences at Berkeley Lab

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.