Abstract: Real-life applications often require the optimization of nonlinear functions of several unknowns or parameters, where the function is the result of highly expensive and complex model simulations involving noisy data (such as climate or financial models, or chemical experiments), or the output of a black-box or legacy code, which prevents the numerical analyst from looking inside to obtain or calculate problem information such as derivatives. Classical optimization algorithms that use derivatives (such as steepest descent and Newton's method) thus often fail, or are entirely inapplicable, in this context. Efficient derivative-free optimization algorithms have been developed over the last 15 years in response to these pressing practical requirements. As even approximate derivatives may be unavailable, these methods must explore the landscape differently and more creatively. In state-of-the-art techniques, clouds of points are generated judiciously and sporadically updated to capture local geometries as inexpensively as possible; local function models around these points are built using techniques from approximation theory and carefully optimized over a local neighbourhood (a trust region) to give an improved solution estimate.
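The model-based trust-region idea described above can be illustrated with a minimal, deliberately simplified Python sketch. This is not the speaker's actual algorithm: it uses only a linear interpolation model built from a small cloud of points, and all function names and parameter choices are illustrative.

```python
import numpy as np

def dfo_trust_region(f, x0, delta=0.5, max_evals=200, tol=1e-8):
    """Toy model-based derivative-free trust-region loop (illustrative).

    Builds a linear interpolation model of f from a small cloud of points
    around the current iterate, minimizes it over a ball of radius `delta`
    (the trust region), and updates the iterate and radius based on the
    ratio of actual to predicted decrease.
    """
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    while evals + n < max_evals and delta > tol:
        # Interpolation cloud: current point plus n coordinate offsets.
        fY = [f(x + delta * e) for e in np.eye(n)]
        evals += n
        # Linear model m(s) = fx + g.s fitted by interpolation:
        # f(x + delta*e_i) = fx + delta*g_i  =>  g_i = (fY_i - fx)/delta.
        g = (np.array(fY) - fx) / delta
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Minimize the model over the trust region: step along -g
        # to the trust-region boundary.
        s = -delta * g / gnorm
        pred = -g @ s                 # predicted decrease (positive)
        f_new = f(x + s)
        evals += 1
        rho = (fx - f_new) / pred     # actual vs. predicted decrease
        if rho > 0.1:                 # successful step: accept
            x, fx = x + s, f_new
            if rho > 0.7:
                delta *= 2.0          # very successful: expand region
        else:
            delta *= 0.5              # poor model: shrink region
    return x, fx, evals

# Example: minimize the 2-D Rosenbrock function without derivatives.
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
xopt, fopt, nevals = dfo_trust_region(rosen, [-1.2, 1.0])
```

Real methods reuse and only sporadically refresh the point cloud, and fit fuller quadratic models, precisely to keep the evaluation budget low; the sketch rebuilds the cloud every iteration for clarity.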
In this talk, I will describe our implementations of, and improvements to, state-of-the-art methods. In the context of the ubiquitous data-fitting/least-squares applications, we have developed a simplified approach that matches the state of the art in terms of budget use while achieving better scalability. Furthermore, we have substantially improved the robustness of derivative-free methods in the presence of noisy evaluations. Theoretical guarantees for these methods will also be provided. Finally, although derivative-free optimization methods can only provably find local optima, we illustrate that, due to their construction and applicability, these methods can offer a practical alternative to global optimization solvers, with improved scalability. This work is joint with Lindon Roberts (Oxford), Katya Scheinberg (Lehigh), Jan Fiala (NAG Ltd) and Benjamin Marteau (NAG Ltd).
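In the least-squares setting, the model-based idea specializes naturally: one can build interpolation models of the individual residuals and take a Gauss-Newton-type step, avoiding a full quadratic model of the objective. The sketch below is a hypothetical illustration of that structure under simple assumptions (coordinate interpolation cloud, crude acceptance test), not the speaker's method or the DFO-LS algorithm.

```python
import numpy as np

def dfo_gauss_newton(r, x0, delta=0.1, max_evals=300, tol=1e-10):
    """Toy derivative-free Gauss-Newton loop for least squares (illustrative).

    The residual Jacobian is estimated from a linear interpolation cloud;
    the Gauss-Newton step is then restricted to the trust region.
    """
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    rx = np.asarray(r(x))
    evals = 1
    while evals + n < max_evals and delta > tol:
        # Estimate the residual Jacobian column-by-column from the cloud.
        J = np.column_stack([(np.asarray(r(x + delta * e)) - rx) / delta
                             for e in np.eye(n)])
        evals += n
        # Gauss-Newton step min ||J s + r(x)||, clipped to the trust region.
        s, *_ = np.linalg.lstsq(J, -rx, rcond=None)
        if np.linalg.norm(s) > delta:
            s *= delta / np.linalg.norm(s)
        r_new = np.asarray(r(x + s))
        evals += 1
        if r_new @ r_new < rx @ rx:      # decrease achieved: accept
            x, rx = x + s, r_new
            delta = min(2.0 * delta, 1.0)
        else:
            delta *= 0.5                 # reject and shrink the region
    return x, 0.5 * (rx @ rx)

# Example: fit y = a*exp(b*t) to three data points (a, b unknown).
t = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 2.0, 4.1])
resid = lambda p: p[0] * np.exp(p[1] * t) - y
p_fit, half_ssq = dfo_gauss_newton(resid, [1.0, 0.5])
```

Only a linear model per residual is needed, which is one reason such specialized least-squares variants can scale better than general-purpose quadratic-model methods.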
Note: Coralia Cartis (BSc Mathematics, Babeș-Bolyai University, Romania; PhD Mathematics, University of Cambridge) joined the Mathematical Institute at Oxford University in 2013 as Associate Professor in numerical optimization. Previously, she worked as a research scientist in numerical analysis at Rutherford Appleton Laboratory and as an assistant professor in the School of Mathematics, University of Edinburgh. Since 2016, she has also been a Turing Fellow at the Alan Turing Institute (the UK's national institute for data science) in London. Her research addresses the development and analysis of algorithms for solving difficult optimization problems: nonconvex, with and without derivatives, local and global. She has also worked on compressed sensing and parameter estimation for climate modelling.