Understanding the Approximation Gap of Neural Networks

20 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Neural networks, function approximation, classification, scientific computing
TL;DR: We discuss the factors that prevent neural networks from achieving low errors on scientific computing problems.
Abstract: Neural networks have gained popularity in scientific computing in recent years. However, they often fail to achieve the same level of accuracy as classical methods, even on the simplest problems. As this appears to contradict the universal approximation theorem, we seek to understand neural network approximation from a different perspective: their approximation capability can be explained by the non-compactness of their image sets, which in turn affects whether a global minimum exists, especially when the target function is discontinuous. Furthermore, we demonstrate that under finite machine precision, the minimum achievable error of neural networks depends on the grid size, even when the theoretical infimum is zero. Finally, drawing on classification theory, we discuss the roles of width and depth in classifying labeled data points, explaining why neural networks also fail to approximate smooth target functions with complex level sets and why increasing the depth alone is not enough to overcome this. Numerical experiments are presented in support of our theoretical claims.
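
As a rough illustration of the abstract's claim that the achievable error for a discontinuous target depends on the grid size, the sketch below fits a small ReLU network to a step function on grids of several sizes and records the best training error reached. This is a minimal sketch assuming a PyTorch setup; the step-function target, network width, optimizer, and hyperparameters are illustrative choices and are not taken from the paper's experiments.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def target(x):
    # Discontinuous step function on [-1, 1] (illustrative target, not the paper's).
    return (x > 0).float()

def best_error_on_grid(n_points, width=64, steps=2000, lr=1e-2):
    # Sample the target on a uniform grid of n_points points.
    x = torch.linspace(-1.0, 1.0, n_points).unsqueeze(1)
    y = target(x)
    # Small one-hidden-layer ReLU network.
    model = nn.Sequential(nn.Linear(1, width), nn.ReLU(), nn.Linear(width, 1))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    best = float("inf")
    for _ in range(steps):
        opt.zero_grad()
        loss = ((model(x) - y) ** 2).mean()
        loss.backward()
        opt.step()
        best = min(best, loss.item())
    return best

# Coarser grids typically allow a lower best MSE than finer ones,
# since fewer points fall near the discontinuity.
for n in (64, 256, 1024):
    print(f"grid size {n:5d}: best MSE = {best_error_on_grid(n):.3e}")
```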
Primary Area: learning theory
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2898