Special Feature: Five Questions For Kathy Yelick
September 23, 2013
As the Associate Laboratory Director for Computing Sciences at Lawrence Berkeley National Laboratory, a professor of Electrical Engineering and Computer Sciences at the University of California at Berkeley and author of more than 100 peer-reviewed research papers, Kathy Yelick is internationally recognized as a leader in high performance computing.
In recognition of her leadership, she received the Association for Computing Machinery’s 2013 Athena Award, for which she will give an invited talk on Nov. 21 at the SC13 conference in Denver. She was Director of the National Energy Research Scientific Computing Center (NERSC) from 2008 to 2012, and she co-invented the UPC and Titanium languages as well as techniques for self-tuning sparse matrix kernels. She earned her Ph.D. in EECS from MIT and has been a professor at UC Berkeley since 1991 with a joint appointment at LBNL since 1996. She has received multiple teaching and research awards. She also serves on many review and planning committees, and is a member of the California Council on Science and Technology and the National Academies’ Computer Science and Telecommunications Board.
In conjunction with the U.S. Department of Energy’s focus on supercomputing in the month of September, Yelick answers five questions about what intrigues her about supercomputers, how they can help save our world, and why more students aren’t going into computing as a career.
Q: When someone you meet for the first time asks what you do for work, what do you tell them?
KY: That was an easier question when my main job was as a professor of computer science at UC Berkeley. I still supervise students, but my job as Associate Laboratory Director for Computing Sciences is my primary responsibility now. In this position, I manage an organization of about 350 scientists and engineers. Our combined effort is aimed at developing computational solutions to major scientific problems and giving the rest of the nation’s research community the tools and facilities to do the same. Scientists use supercomputers to create complex simulations of problems that are otherwise too big, too small, too fast or too slow to study any other way. They also use supercomputers to analyze and understand massive datasets created at large-scale experiments, such as particle accelerators or genome sequencing facilities.
To help the scientific community do this, we operate one of the world’s most advanced supercomputing centers, called NERSC, where powerful computers and highly trained staff work to solve challenges like understanding climate change, developing renewable fuels, creating better batteries, studying biological and chemical mechanisms, and discovering the secrets of the universe.
Supercomputers are made up of thousands of processors, similar to what you might run in your personal laptop, but the software is much more complicated because the processors have to communicate with one another to solve complex problems. For example, in climate modeling, each processor may work on a different region of the Earth, but it will have to send messages to other processors working on neighboring regions. To write such programs, one needs to understand something about the science area, the underlying mathematics, and how the computer systems behave, so we often have teams of experts working together on them.
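The pattern described above, in which each processor exchanges boundary values with its neighbors before computing, is often called a halo exchange. The sketch below illustrates the idea in plain Python, with a list of small subdomains standing in for the processors; the domain, values, and averaging rule are invented for illustration, and a real climate code would send these boundary values as messages between processors (for example, with MPI) rather than reading neighboring lists directly.

```python
# Toy "halo exchange": each worker owns a slice of a 1-D field and
# must obtain boundary cells from its neighbors before smoothing.
# Plain lists stand in for processors; real codes exchange these
# values as messages between machines.

def halo_exchange(domains):
    """Give each subdomain copies of its neighbors' edge cells."""
    halos = []
    for i, d in enumerate(domains):
        left = domains[i - 1][-1] if i > 0 else d[0]
        right = domains[i + 1][0] if i < len(domains) - 1 else d[-1]
        halos.append((left, right))
    return halos

def smooth(domains):
    """One three-point averaging step using the exchanged halo values."""
    halos = halo_exchange(domains)
    new = []
    for (left, right), d in zip(halos, domains):
        padded = [left] + d + [right]
        new.append([(padded[j - 1] + padded[j] + padded[j + 1]) / 3
                    for j in range(1, len(padded) - 1)])
    return new

# Three "processors", each owning two cells; heat spreads outward.
domains = [[0.0, 0.0], [9.0, 9.0], [0.0, 0.0]]
print(smooth(domains))  # → [[0.0, 3.0], [6.0, 6.0], [3.0, 0.0]]
```

The key point the example captures is that no subdomain can take its step until the neighbor data arrives, which is why communication patterns dominate the design of parallel scientific software.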
We also run a high-speed network called ESnet, which has a 100 gigabits-per-second network backbone that creates a kind of information highway for large scientific data sets. ESnet is the Department of Energy’s dedicated network serving tens of thousands of scientists at DOE research sites and universities and providing high-speed connections to their collaborators around the world.
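To give a sense of what a 100 gigabit-per-second backbone means for scientific data, here is a back-of-the-envelope calculation; it assumes the full line rate is available to one transfer and ignores protocol overhead, so the real-world time would be somewhat longer.

```python
# Rough transfer time for a large dataset on a 100 Gb/s link.
# Assumes the full backbone rate and no protocol overhead.
link_gbps = 100                      # gigabits per second
dataset_tb = 1                       # a 1-terabyte dataset
dataset_gbits = dataset_tb * 8_000   # 1 TB = 8,000 gigabits
seconds = dataset_gbits / link_gbps
print(seconds)  # → 80.0 (about a minute and a half at full line rate)
```

At that rate, even multi-terabyte experimental datasets can move between facilities in minutes rather than days, which is what makes remote analysis of large experiments practical.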
Q: Within the computer science community, there is a lot of discussion about why women are not entering the field. There is a broader issue that women are under-represented in the STEM (Science, Technology, Engineering and Math) fields. What are your thoughts on this?
KY: I wish there were a simple answer to this question, but I think multiple factors contribute to the gender imbalance in many STEM fields, including computer science. We know there are many smart, highly motivated young women who go into fields like medicine, biology, and law, but are not as well represented in other STEM fields. I think a real issue is lack of understanding about what scientists in disciplines like physics, chemistry or computing actually do; there are few examples in the popular media, and the fields may seem too abstract to young students. Why this affects women or some ethnic groups disproportionately, I can’t say, but I think programs that help explain what we do, and how it is used to solve important societal problems, are at least a piece of the solution. At UC Berkeley, the Department of Electrical Engineering and Computer Sciences has developed an introductory class called “The Beauty and Joy of Computing” to go beyond the mechanics of computing and give students a taste of the kind of creativity, problem-solving skills, and teamwork involved in writing software. Hopefully, classes like these will encourage people from many different backgrounds to study computer science.
Q: What spurred your interest in computers, especially supercomputers?
KY: When I was a freshman at MIT, I took a computer science class because people said it would be good for me—after all, everyone should know something about computing. But I found the class fascinating, and discovered that I really loved the process of writing programs. Programming is really about solving the puzzle of how to tell a computer to do something. You have to be very precise, as computers can’t interpret vague commands, but there is also an aesthetic to writing software, because you want the programs to be organized and easy for others to read and understand. So, I took another computer science class and learned how to make computers solve more sophisticated problems, from sending e-mail to proving theorems. And along the way, I earned my bachelor’s, master’s and doctorate degrees all in electrical engineering and computer science.
Looking for a new challenge, I turned my attention to parallel programming, in which many processors have to coordinate to solve a bigger problem. I joined UC Berkeley in 1991, where there were a number of other professors working on parallel hardware, software, and mathematics. Much of my research has been focused on designing easier ways of programming parallel machines, but as I became more involved at Berkeley Lab, my work has shifted more towards the question of what you can do with computers, rather than how to do it.
Q: You have a presentation for high school students called “Saving the world with computing.” How do you see that happening?
KY: Supercomputing is being used to study two of the most significant challenges society is facing. The first is what I call our changing world, and this includes understanding climate change, developing alternative energy sources, environmental mitigation techniques, and so on. The second broad area is health and medicine, where supercomputers are increasing our understanding of the human body, revealing why some people are more susceptible to certain diseases, and helping us develop better treatments and prevent disease. As we deploy more powerful computers, we are able to study these problems in ever-increasing detail, and that can help us find the answers sooner.
With regard to climate change, supercomputers are helping us develop and run very detailed models of how our climate has changed over the centuries and what will happen in the years to come. We can also adjust these models to test different scenarios, such as reducing our greenhouse gases, to see what will happen. We are also studying how climate change will increase the number and severity of extreme weather events, such as hurricanes, tornadoes and superstorms. And we can use modeling and simulation to help develop new ways to reduce the output of greenhouse gases, whether through using renewable energy or safely storing carbon dioxide underground before it reaches the atmosphere.
In the area of health and medicine, the long-term vision is to create a digital human, a 3D image-based medical record that is a “digital body double” of each of us. This image would allow doctors to diagnose problems, perform less-invasive surgery and test experimental treatments. My group used one of our new programming languages to create a simulation of a beating human heart, showing how the language helped to make the program simpler and easier to adapt to other problems.
In addition to solving these critical problems, there are always new computing problems to be studied. And, you also get to meet and work with a lot of great people—I feel very lucky to be able to work on something I enjoy and always have opportunities to learn new areas.
Q: Where do you see the supercomputing field in five years?
KY: I think the scientific process is about to undergo a transformation, in the same way that the Internet has combined with web content and search engines to revolutionize every aspect of our lives. Far from the old model of the individual scientist working alone in a laboratory, scientists will be able to combine their own data with that of other scientists, re-using and re-analyzing data, and making use of sophisticated mathematical analyses to help discover relationships across data sets. Scientists already work in interdisciplinary teams, but in the future they will be able to easily search for and find scientific results from other teams, sometimes with surprising outcomes. We are looking for ways to provide the computing systems and services to help our users get the science out of their data, and as with any other complex problem, we need to develop the underlying mathematics and better ways of expressing those mathematical relationships as programs.
The kinds of scientific questions we are now asking, whether about data collected from experiments or simulations of a theory, will require more powerful computers. In some sense, the easy science questions have been answered, and we are now trying to understand how various physical systems work together to produce a functioning human body or explain the images we see of the universe. Answering these questions will require an entirely new architecture for supercomputers, because the performance of individual processors has plateaued, so instead we are now squeezing more and more processor cores onto every chip. We used to think of computation as being expensive, but today we worry as much about moving data around the computer, both between the processor and memory and between processors in a parallel computer. There is no consensus yet on what such a next-generation supercomputer will look like, but we do know it will be quite different from today’s machines.
About Computing Sciences at Berkeley Lab
The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.
ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation.
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States.