Deep Learning meets Parametric Partial Differential Equations


Keywords:

Advanced Numerical Methods for Scientific Computing
Speaker:
Gitta Kutyniok
Affiliation:
Institute of Mathematics, Technische Universität Berlin (DE)
When:
Thursday 16th July 2020
Time:
14:00
Where:
Online seminar: https://mox.polimi.it/elenco-seminari/?id_evento=1977&t=763724
Abstract:
High-dimensional parametric partial differential equations (PDEs) appear in various contexts including control and optimization problems, inverse problems, risk assessment, and uncertainty quantification. In most such scenarios, the set of all admissible solutions associated with the parameter space is inherently low-dimensional. This fact forms the foundation for the reduced basis method. Recently, numerical experiments demonstrated the remarkable efficiency of using deep neural networks to solve parametric problems. In this talk, after an introduction to deep learning, we will present a theoretical justification for this class of approaches. More precisely, we will derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric PDEs. In fact, without any knowledge of its concrete shape, we exploit the inherent low-dimensionality of the solution manifold to obtain approximation rates that are significantly superior to those provided by classical approximation results. We use this low-dimensionality to guarantee the existence of a reduced basis. Then, for a large variety of parametric PDEs, we construct neural networks that yield approximations of the parametric maps not suffering from a curse of dimensionality and essentially only depending on the size of the reduced basis. Finally, we present a comprehensive numerical study of the effect of approximation-theoretical results for neural networks on practical learning problems in the context of parametric partial differential equations. These experiments strongly support the hypothesis that approximation-theoretical effects heavily influence the practical behavior of learning problems in numerical analysis.
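To make the general idea concrete, the sketch below (not taken from the talk itself) illustrates one way such a pipeline can look in practice: a 1D parametric diffusion problem is solved by finite differences for sampled parameters, a reduced basis is extracted by proper orthogonal decomposition (POD), and a small ReLU network is trained with PyTorch to approximate the parametric map from the parameter to the reduced-basis coefficients. The model problem, network size, and all names are illustrative assumptions.

```python
# Minimal sketch: ReLU network approximating the parametric map of a toy
# parametric PDE via a POD reduced basis. Everything below is an assumption
# for illustration, not the construction analyzed in the talk.
import numpy as np
import torch
import torch.nn as nn

def solve_fd(mu, n=100):
    """High-fidelity solve of -(a(x; mu) u')' = 1 on (0,1), u(0)=u(1)=0,
    with a(x; mu) = 1 + mu[0]*x + mu[1]*sin(pi*x), by central finite differences."""
    x = np.linspace(0.0, 1.0, n + 2)            # grid including boundary nodes
    h = x[1] - x[0]
    xm = 0.5 * (x[:-1] + x[1:])                 # midpoints, where a is evaluated
    a = 1.0 + mu[0] * xm + mu[1] * np.sin(np.pi * xm)
    main = (a[:-1] + a[1:]) / h**2              # tridiagonal stiffness matrix
    off = -a[1:-1] / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    f = np.ones(n)
    return np.linalg.solve(A, f)                # interior values only

# 1) Snapshots over the parameter domain.
rng = np.random.default_rng(0)
mus = rng.uniform(0.0, 1.0, size=(300, 2))
S = np.stack([solve_fd(mu) for mu in mus], axis=1)   # n x 300 snapshot matrix

# 2) Reduced basis from the leading left singular vectors (POD).
U, sig, _ = np.linalg.svd(S, full_matrices=False)
r = 8                                           # reduced-basis size
V = U[:, :r]                                    # n x r basis
C = V.T @ S                                     # reduced coefficients of the snapshots

# 3) ReLU network approximating mu -> reduced-basis coefficients.
net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, r))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
X = torch.tensor(mus, dtype=torch.float32)
Y = torch.tensor(C.T, dtype=torch.float32)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(X), Y)
    loss.backward()
    opt.step()

# 4) Reconstruct u(mu) ~ V @ net(mu) for an unseen parameter.
mu_test = np.array([0.3, 0.7])
u_true = solve_fd(mu_test)
with torch.no_grad():
    c_pred = net(torch.tensor(mu_test, dtype=torch.float32)).numpy()
u_pred = V @ c_pred
print("relative L2 error:", np.linalg.norm(u_pred - u_true) / np.linalg.norm(u_true))
```

Note that the network only has to output r reduced-basis coefficients rather than the full high-fidelity solution vector, which mirrors the statement in the abstract that the network size essentially depends on the size of the reduced basis rather than on the discretization dimension.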
Note:
Gitta Kutyniok currently holds an Einstein Chair in the Institute of Mathematics at the Technische Universität Berlin, a courtesy appointment in the Department of Computer Science and Engineering, and an Adjunct Professorship in Machine Learning at the University of Tromsø; she also heads the Applied Functional Analysis Group. She received her Diploma in Mathematics and Computer Science as well as her Ph.D. degree from the Universität Paderborn in Germany, and her Habilitation in Mathematics in 2006 at the Justus-Liebig-Universität Gießen. From 2001 to 2008 she held visiting positions at several US institutions, including Princeton University, Stanford University, Yale University, Georgia Institute of Technology, and Washington University in St. Louis. In 2008, she became a full professor of mathematics at the Universität Osnabrück and moved to Berlin three years later. She has received various awards for her research, such as an award from the Universität Paderborn in 2003, the Research Prize of Gießen and a Heisenberg Fellowship in 2006, the von Kaven Prize of the DFG in 2007, and an Einstein Chair in 2008. She gave the Noether Lecture at the ÖMG-DMV Congress in 2013 and the Hans Schneider ILAS Lecture at IWOTA in 2016. She also became a member of the Berlin-Brandenburg Academy of Sciences and Humanities in 2017, a SIAM Fellow in 2019, and an IEEE Senior Member in the same year. She was Chair of the SIAM Activity Group on Imaging Sciences from 2018 to 2019 and is Co-Chair of the first SIAM Conference on Mathematics of Data Science taking place this year. She is also, for instance, Scientific Director of the graduate school BIMoS at TU Berlin and Chair of the GAMM Activity Groups on Mathematical Signal- and Image Processing and Computational and Mathematical Methods in Data Science, as well as of the MATH+ Activity Group on Mathematics of Data Science. Her main research interests are in the areas of applied harmonic analysis, compressed sensing, high-dimensional data analysis, imaging science, inverse problems, machine learning, numerical mathematics, partial differential equations, and applications to life sciences and telecommunication.