Energy Efficient Computing
As we bump up against the limits of processor speed, memory and energy consumption, Berkeley Lab researchers are rethinking every aspect of scientific computing—from hardware and software to algorithms, data center efficiency and networking. The aim isn't just to reduce energy use, but to produce more science per watt.
Scientists and engineers in Berkeley Lab's Computational Research, National Energy Research Scientific Computing, Information Technology, and Environmental Energy Technologies divisions are working together to solve a significant problem faced by computing centers worldwide: how to engineer, build and operate power-efficient computers and data centers. Their research spans a wide range of issues, from new computer architectures built on low-power processors to innovative building designs. ESnet researchers are also exploring how to improve the energy efficiency of national networks.
Since the advent of parallel computing in the early 1990s, supercomputer performance has advanced by adding more processors running at higher speeds. Once processor clock speeds leveled off around 2006, system designers turned to packing more cores onto each chip. That was enough to achieve petaflop/s-level performance, but extending this approach to the next step—exascale computing—would be so energy intensive that no center could afford to power a system drawing the roughly 200 megawatts it would require.
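To put that figure in perspective, a back-of-the-envelope sketch follows, using only the numbers above: an exaflop/s (10^18 operations per second) target and a 200-megawatt power draw. The electricity rate of $0.10 per kilowatt-hour is an assumption for illustration, not a figure from the source.

```python
# Back-of-the-envelope: what would exascale at 200 MW imply?
# Assumed figures: 1 exaflop/s target; ~200 MW draw (from the text);
# $0.10/kWh is a hypothetical electricity rate for illustration.

EXAFLOPS = 1e18        # floating-point operations per second
POWER_WATTS = 200e6    # 200 megawatts

# Efficiency such a system would need, in operations per joule
# (equivalently, flop/s per watt)
flops_per_watt = EXAFLOPS / POWER_WATTS   # 5e9 = 5 gigaflops per watt

# Annual energy use and a rough utility bill at the assumed rate
HOURS_PER_YEAR = 24 * 365
annual_kwh = POWER_WATTS / 1000 * HOURS_PER_YEAR
annual_cost_musd = annual_kwh * 0.10 / 1e6

print(f"Required efficiency: {flops_per_watt / 1e9:.0f} gigaflops/watt")
print(f"Annual energy: {annual_kwh:.3g} kWh (~${annual_cost_musd:.0f}M/yr)")
```

Even at this scale the required efficiency works out to about 5 gigaflops per watt, and the electricity bill alone would run well over $100 million a year at the assumed rate—which is why designers look to new architectures rather than simply scaling up existing ones.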
To get beyond the current limitations of chip performance and energy demands, entirely new architectures will be needed, and the Berkeley Lab Computing Sciences organization is looking for solutions. It may help to view the exascale problem as one of continuing to improve computing performance, rather than focusing solely on how to build the biggest supercomputer we can. It's critical to start by creating the basic building blocks of such a system, but to create them in such a way that they can be assembled to scale to exaflops. This is no easy task, as the building blocks are themselves systems: processors, memory systems, storage systems, networking systems and more. Then there is the operating system and software needed to make the machine useful.
Such building blocks could also help address the big data problems that businesses and organizations are beginning to wrestle with today.