Stefan Wild, Director of Berkeley Lab’s Applied Mathematics and Computational Research (AMCR) Division, has been awarded the INFORMS Optimization Society’s 2023 Egon Balas Prize for his fundamental contributions to derivative-free optimization (DFO)—from algorithmic development and convergence theory to practical implementations and software tools. He will accept the award on Sunday, October 15, at the 2023 INFORMS Annual Meeting in Phoenix, Arizona. With over 12,500 members from around the globe, INFORMS is the leading international association for professionals in operations research, analytics, and management science.

In the citation, the award committee wrote that they were impressed by “Wild’s fundamental contributions to optimization theory and algorithms with significant and impactful contributions to scientific computing and applications.” Named for renowned mathematician Egon Balas, the prize is given annually to an individual for a body of contributions in optimization, recognizing the winner’s innovativeness and impact in the field, including its theory, algorithms, and computations.

Optimization is widely used in academia and industry to tackle complex design and decision-making problems. Mathematical optimization methods are used to identify possible solutions in a given situation and select the best one. 

Numerical methods for solving optimization problems usually require the ability to compute derivatives of an objective or cost function with respect to the available choices. In a manufacturing problem, for example, the objective function could be the number of products a company can produce; it can be maximized or minimized subject to constraints based on factors such as labor, cost, resources, or time. Derivatives describe how the objective function changes as the decision variables change, so they help researchers identify adjustments that increase or decrease the function and recognize when an optimal solution has been found.
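The role derivatives play can be seen in a toy example. The sketch below is purely illustrative (it is not drawn from Wild’s work): hand-computed derivatives steer gradient descent toward the minimizer of a simple cost function.

```python
# Illustrative only: minimize f(x, y) = (x - 3)^2 + (y + 1)^2 by
# following its derivatives downhill. The derivatives tell us which
# direction decreases the objective at the current point.

def grad(x, y):
    # Partial derivatives of f with respect to each decision variable.
    return 2 * (x - 3), 2 * (y + 1)

x, y = 0.0, 0.0   # starting guess
step = 0.1        # step size along the negative gradient
for _ in range(200):
    gx, gy = grad(x, y)
    x -= step * gx
    y -= step * gy

print(round(x, 3), round(y, 3))  # converges to the minimizer (3.0, -1.0)
```

When derivatives like `grad` are available, each iteration knows exactly which way to move; the paragraphs below describe what happens when they are not.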

But there are times when derivatives can’t be calculated or estimated, such as when evaluating the objective requires performing a physical or numerical experiment. Derivative-free optimization is the mathematical study of optimization algorithms for problems where derivatives are unavailable.
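To make the derivative-free setting concrete, here is a minimal sketch of one classic DFO strategy, compass (coordinate) search, which probes the objective at nearby points instead of computing derivatives. This is a generic textbook method, not an implementation of Wild’s algorithms.

```python
# Sketch of compass search: try a step along each coordinate direction,
# move to any point that improves the objective, and halve the step when
# no direction improves. No derivatives are ever computed.

def compass_search(f, x0, step=1.0, tol=1e-6, max_evals=10_000):
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                evals += 1
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step /= 2  # no improving direction found; refine the step

    return x, fx

# A "black-box" objective: imagine each call runs an experiment or a
# costly simulation rather than evaluating a formula.
best_x, best_f = compass_search(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2,
                                [0.0, 0.0])
print(best_x, best_f)
```

Because each trial point costs one full objective evaluation, methods like this are judged by how much progress they make per evaluation — which is exactly the budget-constrained view of solver performance discussed below.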

Wild is internationally known for his work in DFO and has a long record of solving optimization problems involving expensive computer simulations, large data sets, and physical experiments. He is a co-developer of data profiles, a popular tool for analyzing the performance of DFO solvers when the computational budget is constrained. He has also developed computational noise estimation techniques and model-based methods for global, non-smooth, and stochastic DFO.
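The idea behind data profiles can be sketched roughly as follows: for each solver, compute the fraction of benchmark problems it solves to a given accuracy as a function of the evaluation budget. The code below is a simplified illustration with made-up variable names and data layout; the published definition additionally normalizes budgets by problem dimension.

```python
# Simplified sketch of a data profile: for each solver, the fraction of
# benchmark problems solved to accuracy tau within a given number of
# objective evaluations. (The real definition scales budgets by the
# number of variables in each problem.)

def data_profile(histories, f_best, f0, tau=1e-3, budgets=range(1, 101)):
    """histories[solver][problem] = best-so-far f value after each evaluation."""
    profiles = {}
    for solver, runs in histories.items():
        curve = []
        for budget in budgets:
            solved = 0
            for p, hist in runs.items():
                # A problem counts as solved once the best value found is
                # within a tau-fraction of the total possible decrease.
                target = f_best[p] + tau * (f0[p] - f_best[p])
                if any(f <= target for f in hist[:budget]):
                    solved += 1
            curve.append(solved / len(runs))
        profiles[solver] = curve
    return profiles

# Tiny synthetic example: one solver, two problems.
profiles = data_profile(
    {"A": {"p1": [5, 3, 1, 0], "p2": [10, 9, 8]}},
    f_best={"p1": 0, "p2": 8},   # best known value per problem
    f0={"p1": 5, "p2": 10},      # starting value per problem
    budgets=range(1, 6),
)
print(profiles["A"])  # → [0.0, 0.0, 0.5, 1.0, 1.0]
```

Plotting such curves for several solvers shows at a glance which solver makes the fastest progress under a tight evaluation budget.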

Wild received his Ph.D. in operations research from Cornell University in 2009. Before he came to Berkeley Lab, Wild was a senior computational mathematician and deputy division director of the Mathematics and Computer Science Division at Argonne National Laboratory. He has served on editorial boards of journals such as INFORMS Journal on Computing, Mathematical Programming Computation, Operations Research, SIAM Journal on Scientific Computing, and SIAM Review. Wild is also an adjunct faculty member in Industrial Engineering and Management Sciences and a senior fellow in NAISE at Northwestern University.

About Computing Sciences at Berkeley Lab

High performance computing plays a critical role in scientific discovery. Researchers increasingly rely on advances in computer science, mathematics, computational science, data science, and large-scale computing and networking to increase our understanding of ourselves, our planet, and our universe. Berkeley Lab's Computing Sciences Area researches, develops, and deploys new foundations, tools, and technologies to meet these needs and to advance research across a broad range of scientific disciplines.