Estimation error of gradient descent in deep regression

24 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Supplementary Material: pdf
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: deep regression, gradient descent, estimation error, approximation, generalization, optimization
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We establish consistency between the outputs of gradient descent and the true regression function in the over-parameterized scenario.
Abstract: A theoretical understanding of deep learning must account for three sources of error: approximation, generalization, and optimization. In recent years, significant advances have been made in analyzing each of these errors individually, or two of them jointly; however, few works analyze all three simultaneously. The obstacle is the gap between the optimization and generalization errors in over-parameterized regimes. In this work, we attempt to bridge this gap by establishing consistency between the outputs of gradient descent and the true regression function in the over-parameterized scenario. Our research offers a feasible path toward a more comprehensive understanding of the theory behind deep learning.
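
For context, the three errors named in the abstract can be organized through a standard excess-risk decomposition. The sketch below uses our own notation, not the paper's: $R(f) = \mathbb{E}[(f(X)-Y)^2]$ is the population risk, $\hat R_n$ its empirical counterpart, $\mathcal{F}$ the network class, $f^*$ the true regression function, and $\hat f_t$ the gradient-descent iterate after $t$ steps. It illustrates the framework the abstract refers to rather than the paper's actual bound.

% Hedged sketch: a standard excess-risk decomposition, not the paper's own statement.
% R is the population risk, \hat R_n the empirical risk, \mathcal{F} the network class,
% f^* the true regression function, \hat f_t the gradient-descent iterate after t steps.
\[
R(\hat f_t) - R(f^*)
\;\le\;
\underbrace{2\sup_{f \in \mathcal{F}} \bigl| R(f) - \hat R_n(f) \bigr|}_{\text{generalization}}
\;+\;
\underbrace{\hat R_n(\hat f_t) - \min_{f \in \mathcal{F}} \hat R_n(f)}_{\text{optimization}}
\;+\;
\underbrace{\min_{f \in \mathcal{F}} R(f) - R(f^*)}_{\text{approximation}}.
\]

Under this decomposition, analyzing all three errors jointly requires bounding the optimization term for the same over-parameterized class $\mathcal{F}$ whose generalization term one controls; loosely speaking, over-parameterization helps gradient descent drive the optimization term down but makes uniform generalization bounds over $\mathcal{F}$ loose, which is where the gap mentioned in the abstract arises.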
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9062