Quantum Information Sciences
For more than 50 years, Moore’s Law reigned supreme. The observation that the number of transistors on a computer chip doubles roughly every two years set the pace for the modern digital revolution, making smartphones, personal computers and today's supercomputers possible. But Moore’s Law is slowing. And even if it weren’t, some of the big problems scientists need to tackle may be beyond the reach of conventional computers.
Researchers at Lawrence Berkeley National Laboratory (Berkeley Lab) are exploring a fundamentally different kind of computing architecture based on quantum mechanics to solve some of science’s hardest problems.
With funding from the Department of Energy’s (DOE) Office of Science and Berkeley Lab’s Laboratory Directed Research and Development (LDRD) program, our researchers are breaking new ground in several areas.
The key to building quantum computers that solve scientific problems beyond the reach of conventional computers is “quantum coherence.” This phenomenon allows quantum systems to store much more information per bit than is possible with traditional computers. Whereas a conventional computer bit encodes information as either 0 or 1, a qubit (quantum bit) can be 0, 1 or a superposition of states (both 0 and 1 at the same time). If these qubits could be linked, or entangled, in a quantum computer, the system would be able to tackle complex challenges intractable for today’s classical computers and complete calculations in a fraction of the time.
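Superposition and entanglement can be illustrated numerically. The following is a minimal textbook-style sketch using NumPy (the state vectors, Hadamard and CNOT gates shown here are standard constructions for illustration, not Berkeley Lab software):

```python
import numpy as np

# Single-qubit computational basis states
ket0 = np.array([1.0, 0.0], dtype=complex)  # |0>
ket1 = np.array([0.0, 1.0], dtype=complex)  # |1>

# A Hadamard gate puts |0> into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are the squared amplitudes (Born rule)
probs = np.abs(psi) ** 2
print(probs)  # [0.5, 0.5] -- equal chance of reading out 0 or 1

# Entangling a second qubit with a CNOT yields a Bell state:
# amplitude only on |00> and |11>, so the two qubits are correlated
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(psi, ket0)
print(np.round(bell, 3))
```

Measuring either qubit of the Bell state instantly fixes the outcome of the other, which is the resource entangled quantum computers exploit.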
But getting qubits into this state of quantum coherence, and making the most of their quantum mechanical properties while they remain in it, is still a challenge. Thus Berkeley Lab is leading the DOE’s Advanced Quantum-Enabled Simulation (AQuES) Testbed project, which aims to examine the behavior of near-term quantum devices for computation. Within the next five years, Berkeley Lab aims to answer a host of questions about the utility of near-term superconducting qubit devices for simulations relevant to the DOE mission.
The AQuES testbed is centered around a quantum core built on a superconducting qubit computing platform, supported by modular classical control and logic, and a flexible software stack consisting of low-level programming, validation, verification and benchmarking tools. The project focuses on: (i) quantum and classical hardware scaling and optimization, including the delineation and exploration of the available co-design tradespace for near-term quantum computing algorithms for simulation; (ii) robust and portable characterization methods for quantum verification and validation, including platform-independent metrics to quantify the computational power of quantum simulators; and (iii) scalable hardware and software interfaces. The project is a collaboration between Berkeley Lab and Lawrence Livermore National Laboratory (LLNL); Berkeley Lab will focus on transmon arrays, and LLNL on multimode cavity arrays.
To develop their qubit devices, the AQuES team is collaborating with UC Berkeley’s Quantum Nanoelectronics Laboratory, already a leader in enabling scalable and high-fidelity quantum hardware resources. They are also leveraging fabrication techniques developed at DOE’s Molecular Foundry, as well as the classical computing resources at DOE’s National Energy Research Scientific Computing Center (NERSC), to simulate and create their quantum hardware. Both DOE facilities are located at Berkeley Lab.
Additionally, the AQuES team is relying on custom classical computing and control developments from Berkeley Lab’s Computational Research, Accelerator Technologies & Applied Physics and Engineering divisions to accelerate the development of these devices and enhance their quality and scalability.
Many of these relationships and collaborations stemmed from an investment in the development of prototype superconducting quantum processors made by Berkeley Lab’s LDRD program several years ago. Berkeley Lab researchers proved the viability of the hardware by successfully calculating the complete energy spectrum of a hydrogen molecule with quantum algorithms developed by another LDRD-funded project. This co-design framework is a hallmark of Berkeley Lab’s Quantum Information Sciences research infrastructure.
Someday, universal quantum computers will be able to solve a wide range of problems, from molecular design to machine learning and cybersecurity, but we’re still a long way from that. So the current question is whether there are specific problems that researchers can tackle in the near term with more specialized quantum computers.
Several years ago, Berkeley Lab’s LDRD program invested in research to develop quantum chemistry and optimization algorithms. Those researchers demonstrated the possibilities for quantum computing in chemistry by using their algorithms to successfully solve the complete energy spectrum of a hydrogen molecule. To further explore the potential for quantum computing for quantum chemistry, the DOE created the Quantum Algorithms Team (QAT4Chem). Led by Berkeley Lab, this endeavor brings together algorithm developers, computer scientists, applied mathematicians and quantum hardware platform developers to optimally design, create and run novel algorithms that will advance scientific discovery in chemical sciences. The team consists of researchers from Argonne National Laboratory, Berkeley Lab, Harvard and UC Berkeley.
“We are in the early stages of quantum computing, kind of like where we were with conventional computing in the 1940s. We have some of the hardware; now we need to develop a robust set of software, algorithms and tools to optimally utilize it to solve really hard science problems,” said Bert de Jong, leader of DOE’s Quantum Algorithms Team project.
The QAT4Chem team is focusing on the development of new classes of algorithms that will, for the first time, be able to capture time dynamics of physical systems on near-term quantum devices. They are also aiming to advance the rapidly growing field of quantum machine learning by developing a quantum autoencoder that can compress quantum data into a subspace and then decompress it and apply it to scientific problems relevant in chemical sciences.
They are also working on stochastic optimization algorithms that accelerate the convergence of hybrid quantum-classical algorithms; the goal is to minimize the number of consecutive operations performed on a quantum device. In addition, they will explore quantum linear algebra solvers, targeting their quantum algorithms at time dynamics and machine learning.
The promise of quantum computing for science is clear, but scientific achievements are impossible without the ability to read out a system’s results. With funding from DOE’s AQuES testbed, researchers in Berkeley Lab’s Accelerator Controls and Instrumentation (BACI) Program are exploring the use of field-programmable gate arrays (FPGAs) to read the microwave-sensed outputs in quantum computing.
While quantum hardware is rapidly advancing, the development of an end-user software stack is still in its infancy, and the development of an executable code for quantum hardware is currently an arduous and unsustainable task. With funding from AQuES and QAT4Chem, Berkeley Lab researchers are engaging with many levels of the software stack, from application-specific abstractions to the low-level languages that interface with quantum hardware.
They are also developing efficient compiling and optimization techniques and software tools, within open-source frameworks, that provide an effective implementation and execution of their algorithms. Near-term devices are noisy, and application-specific error mitigation techniques can be developed to improve results. Additionally, Berkeley Lab researchers are working on software to optimize the order in which program instructions are executed, taking into account qubit constraints such as connectivity and gate-execution times.
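Ordering instructions under connectivity constraints can be made concrete with a small example. On many superconducting chips, two-qubit gates can only run between physically coupled qubits, so a compiler inserts SWAP gates to move logical qubits next to each other. The greedy router below is a simplified sketch of that idea, using a hypothetical 4-qubit device with linear connectivity; it is not Berkeley Lab's compiler:

```python
from collections import deque

# Hypothetical 4-qubit device with linear connectivity: 0-1-2-3
COUPLING = {(0, 1), (1, 2), (2, 3)}

def shortest_path(a, b):
    # BFS over the coupling graph to find a chain of coupled qubits
    nbrs = {}
    for u, v in COUPLING:
        nbrs.setdefault(u, []).append(v)
        nbrs.setdefault(v, []).append(u)
    prev, queue = {a: None}, deque([a])
    while queue:
        u = queue.popleft()
        if u == b:
            break
        for v in nbrs[u]:
            if v not in prev:
                prev[v] = u
                queue.append(v)
    path = [b]
    while prev[path[-1]] is not None:
        path.append(prev[path[-1]])
    return path[::-1]

def route(gates):
    """Greedy router: insert SWAPs so every two-qubit gate acts on
    coupled qubits. `layout` maps logical qubits to physical qubits."""
    layout = {q: q for q in range(4)}
    out = []
    for name, (a, b) in gates:
        pa = layout[a]
        path = shortest_path(pa, layout[b])
        # Walk the first qubit along the path until it neighbors the second
        for nxt in path[1:-1]:
            out.append(("SWAP", (pa, nxt)))
            inv = {p: l for l, p in layout.items()}
            layout[inv[pa]], layout[inv[nxt]] = nxt, pa
            pa = nxt
        out.append((name, (pa, layout[b])))
    return out

# Logical circuit: a CNOT between qubits 0 and 3, which are not coupled
print(route([("CNOT", (0, 3))]))
```

Every inserted SWAP costs time and adds noise, which is why minimizing instruction count and respecting gate-execution times matters so much on near-term hardware.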
To ensure success, the Berkeley Lab-led projects are taking a co-design approach, developing their programming models and languages alongside evolving quantum hardware and theory. And to fully exploit all quantum computing capabilities, lab experts are becoming an integral part of open-source efforts in academia and industry.
About Computing Sciences at Berkeley Lab
The Computing Sciences Area at Lawrence Berkeley National Laboratory (Berkeley Lab) provides the computing and networking resources and expertise critical to advancing Department of Energy Office of Science (DOE-SC) research missions: developing new energy sources, improving energy efficiency, developing new materials, and increasing our understanding of ourselves, our world, and our universe. ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 7,000-plus scientists at national laboratories and universities. NERSC and ESnet are both Department of Energy Office of Science National User Facilities. The Computational Research Division (CRD) conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation.
Berkeley Lab addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science. The DOE Office of Science is the United States' single largest supporter of basic research in the physical sciences and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.