Entropy-based convergence rates of greedy algorithms

Published: 01 Jan 2023, Last Modified: 12 May 2023, CoRR 2023
Abstract: We present convergence estimates of two types of greedy algorithms in terms of the metric entropy of underlying compact sets. In the first part, we measure the error of a standard greedy reduced basis method for parametric PDEs by the metric entropy of the solution manifold in Banach spaces. This contrasts with the classical analysis based on the Kolmogorov n-widths and enables us to obtain direct comparisons between the greedy algorithm error and the entropy numbers, where the multiplicative constants are explicit and simple. The entropy-based convergence estimate is sharp and improves upon the classical width-based analysis of reduced basis methods for elliptic model problems. In the second part, we derive a novel and simple convergence analysis of the classical orthogonal greedy algorithm for nonlinear dictionary approximation using the metric entropy of the symmetric convex hull of the dictionary. This also improves upon existing results by giving a direct comparison between the algorithm error and the metric entropy.
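The orthogonal greedy algorithm discussed in the second part is, in the Hilbert-space setting, the classical orthogonal matching pursuit: at each step it selects the dictionary atom most correlated with the current residual, then re-projects the target onto the span of all atoms selected so far. A minimal finite-dimensional sketch (the function name and setup are illustrative, not taken from the paper):

```python
import numpy as np

def orthogonal_greedy(dictionary, target, n_steps):
    """Orthogonal greedy algorithm (orthogonal matching pursuit) sketch.

    dictionary: (n_atoms, dim) array of unit-norm atoms (rows)
    target:     (dim,) vector to approximate
    Returns (selected_indices, approximation).
    """
    residual = target.copy()
    selected = []
    for _ in range(n_steps):
        # Greedy step: pick the atom most correlated with the residual.
        scores = np.abs(dictionary @ residual)
        scores[selected] = -np.inf  # do not reselect an atom
        selected.append(int(np.argmax(scores)))
        # Orthogonal step: project the target onto the span of the
        # selected atoms (this is what distinguishes the *orthogonal*
        # greedy algorithm from the pure greedy algorithm).
        A = dictionary[selected].T  # (dim, |selected|)
        coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
        residual = target - A @ coeffs
    return selected, target - residual
```

The paper's contribution is not the algorithm itself but bounding its error after `n_steps` iterations directly by the metric entropy of the symmetric convex hull of the dictionary, rather than through intermediate quantities.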