
Berkeley Lab Hosts National Academy Symposium on Future of Supercomputers

June 13, 2001

About three dozen members of the prestigious National Academy of Engineering visited Berkeley Lab on June 8 to attend a symposium entitled “Do Supercomputers Have a Future?”

The invitation-only symposium featured experts from the fields of supercomputer manufacturing, design and utilization – Burton Smith, chief scientist of Cray Inc., David Patterson, professor of computer science at UC Berkeley, and Bill McCurdy, head of Berkeley Lab Computing Sciences and a chemical physicist who computes on supercomputers. The afternoon concluded with a panel discussion, in which the three speakers were joined by Ed Oliver, head of DOE’s Office of Advanced Scientific Computing Research, and Horst Simon, director of the Lab’s National Energy Research Scientific Computing Division.

Oliver, noting that Office of Science researchers are “truly insatiable for cycles,” said that while the public may think of defense applications when they think of supercomputing, in 20 years the Office of Science will be “the dominant force in high-end computing in DOE.”

The three top issues regarding the future of supercomputing, Oliver said, are “Can they build it? Does it work? Can scientists and engineers use it effectively?”

Smith, who founded Tera Computer Co. to develop and build a supercomputer based on a radically different architecture known as Multithreaded Architecture, addressed Oliver’s first two points in his presentation, “What Happened to Supercomputers?” Smith said that today’s supercomputers are really multicomputers, or clusters of computers. In the past, supercomputers referred to specially designed and built systems, such as those built by Cray in the 1970s and ’80s.

Those supercomputers consisted of custom vector processors that were very fast for their time but have since lost ground to systems that use hundreds or thousands of commodity processors to run jobs in parallel. This parallel architecture, however, poses new challenges in accessing memory. Although today’s parallel supercomputers perform trillions of calculations per second, memory bandwidth is too small and response time is too long to keep the arithmetic units fed. This problem of slow response time, called latency, affects most computers today.
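As a rough illustration of the imbalance Smith described, the ratio of memory bandwidth to arithmetic speed (sometimes expressed in bytes per flop) shows how much data the memory system can actually deliver for each operation. The sketch below uses hypothetical, round-number figures chosen for illustration, not the specifications of any particular machine.

```python
# A minimal sketch of the bandwidth-versus-compute imbalance.
# All numbers are hypothetical round figures, not the specs of any real system.

peak_flops = 1.0e12        # 1 teraflop/s of peak arithmetic
mem_bandwidth = 1.0e11     # 100 GB/s of memory bandwidth
bytes_per_operand = 8      # one double-precision value

# How many operands can memory deliver per second?
operands_per_sec = mem_bandwidth / bytes_per_operand

# Bytes of memory traffic available per floating-point operation ("machine balance").
balance = mem_bandwidth / peak_flops

print(f"Operands delivered per second: {operands_per_sec:.2e}")
print(f"Bytes of memory traffic per flop: {balance:.2f}")

# If a computation needs one fresh operand per flop (8 bytes/flop),
# memory can sustain only a small fraction of the peak arithmetic rate.
required_bytes_per_flop = 8
achievable_flops = mem_bandwidth / required_bytes_per_flop
print(f"Memory-bound kernel: {achievable_flops:.2e} flop/s "
      f"({100 * achievable_flops / peak_flops:.0f}% of peak)")
```

With these assumed numbers, a kernel that needs one operand per operation runs at only about one percent of the machine’s peak arithmetic rate, which is the kind of gap Smith argued has been left unaddressed.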

Although many reasons, ranging from the end of the Cold War to decreasing costs of computer chips to lack of funding for computer research, have been offered to explain the decline of supercomputer manufacturers, Smith said he thinks the real problem is the imbalance between memory bandwidth and computing speed.

The problem, though, is not so much a technological one as a political and social one, he said. Because computers are so fast, most people think computer architecture is a “dead topic,” and there is no political will to address the problem, even though it is fundamental to the future of research in biology, fluid dynamics, physics, chemistry and other disciplines, Smith said.

“We have a clear need for a spectrum of computing,” he said. “In supercomputing, there is a deficiency and we should fill it.”

Speaking from the scientist’s perspective, McCurdy told the audience that when parallel computers were first deployed, the systems presented scientists with a seemingly insurmountable barrier by requiring codes to be rewritten in parallel. However, scientists in all disciplines have overcome this hurdle and have achieved dramatic breakthroughs using parallel computers.

Now that the research community has made the paradigm shift to parallel computing, more powerful computers are needed to further advance research in areas such as genomics, combustion and materials science, McCurdy said. Climate research, for example, needs from 10 to 40 teraflop/s (trillions of calculations per second) sustained computing speed to develop accurate models. By comparison, NERSC has recently installed a 3.8 teraflop/s IBM SP supercomputer, which is the world’s most powerful computer for unclassified research.
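For scale, a back-of-the-envelope comparison of the climate-modeling requirement McCurdy cited against the newly installed NERSC system, treating both figures simply as quoted and without adjusting for sustained-versus-peak differences, looks like this:

```python
# Back-of-the-envelope comparison of the quoted figures (teraflop/s).
nersc_ibm_sp = 3.8          # newly installed NERSC IBM SP
climate_need_low = 10.0     # low end of the cited sustained requirement
climate_need_high = 40.0    # high end of the cited sustained requirement

print(f"Low end:  {climate_need_low / nersc_ibm_sp:.1f}x the NERSC system")
print(f"High end: {climate_need_high / nersc_ibm_sp:.1f}x the NERSC system")
```

Even by this crude comparison, climate modeling alone calls for roughly three to ten times the capability of the most powerful unclassified machine then available.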

The dilemma for scientists, McCurdy said, is now that they are taking advantage of these commodity-based systems, scaling up to even larger computers will be limited by the problems of communication between processors. With the billion-dollar-a-year market for scientific computing systems dwarfed by that for web servers, the emphasis continues to be on making faster processors, not increasing communication bandwidth or reducing latency, McCurdy said. The solution to this communication problem may come from other technology, such as the optical networking being developed for faster Internet communications.
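The scaling limit McCurdy described can be illustrated with a deliberately simple model in which the compute work divides evenly across processors while each processor’s communication cost stays roughly constant. The constants and the model itself are illustrative assumptions, not measurements from any real system.

```python
# A deliberately idealized model of communication-limited scaling.
# Compute time shrinks as processors are added; per-processor
# communication time does not. All constants are hypothetical.

def parallel_time(p, compute_time=1000.0, comm_time_per_proc=0.5):
    """Estimated wall-clock time on p processors (arbitrary units)."""
    return compute_time / p + comm_time_per_proc

def speedup(p):
    return parallel_time(1) / parallel_time(p)

for p in (1, 16, 256, 1024, 4096):
    print(f"{p:5d} processors: speedup {speedup(p):8.1f} (ideal {p})")
```

In this toy model the speedup tracks the processor count closely at first, then flattens as the fixed communication cost comes to dominate, which is the diminishing return that makes simply adding more commodity processors an incomplete answer.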

Patterson, who led the design and implementation of the reduced instruction set computer (RISC), a design that provides a small set of simple instructions but executes them extremely quickly, said that another technology that could be adapted for supercomputers may already be at hand. As the next step, Patterson suggested adopting the technology of embedded computing, such as that used in cell phones, for future supercomputers. Such systems use low power and have memory integrated with the processor. Using such chips for supercomputers, though, would require very specialized designs, unlike the current practice of using commodity processors.

In opening the symposium, Lab Director Charles Shank said it was “a privilege to host this event.” Members of the National Academy of Engineering, part of the National Academy of Sciences, are elected to the academy, which prepares hundreds of advisory reports annually on issues of importance to the nation.


About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery. Researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab’s Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.