Practical existence theorems for deep learning approximation in high dimensions

Published: 25 Mar 2025, Last Modified: 20 May 2025
Venue: SampTA 2025, Invited Talk
License: CC BY 4.0
Session: Sampling and learning of deep neural networks (Philipp Petersen)
Keywords: Deep learning, sampling theory, approximation theory, practical existence theorems
Abstract: Deep learning is having a profound impact on both industry and scientific research. While this paradigm continues to demonstrate impressive performance across a wide range of applications, its mathematical foundations remain insufficiently understood. Motivated by deep learning methods in scientific computing, I will illustrate the framework of practical existence theorems. These theorems aim to bridge the gap between theory and practice by combining constructive approximation results for deep neural networks with recovery guarantees from least squares and compressed sensing theory. They identify sufficient conditions on network architecture, training strategy, and training set size that guarantee a desired level of accuracy for a target function class. I will highlight recent advances in the field and demonstrate the application of practical existence theorems in high-dimensional function approximation, reduced-order modeling, and physics-informed machine learning.
Submission Number: 24