Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity

Published: 09 Mar 2020 · Last Modified: 02 Feb 2025 · COLT 2020 · CC BY 4.0
Abstract: We present and study approximate notions of dimensional and margin complexity, which correspond to the minimal dimension or norm of an embedding required to approximate, rather than exactly represent, a given hypothesis class. We show that such notions are not only sufficient for learning using linear predictors or a kernel but, unlike the exact variants, are also necessary. Thus, they are better suited for discussing the limitations of linear or kernel methods.
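For context, a hedged sketch of the exact notion and its probabilistic relaxation. The first display is the standard definition of dimension complexity; the second is one natural way to formalize the approximate variant the abstract describes, where the embedding need only agree with each hypothesis on all but an $\varepsilon$-fraction of the data distribution. The exact quantifier order (e.g., whether the embedding may depend on the distribution) and constants follow the paper and may differ from this sketch.

```latex
% Exact dimension complexity of a hypothesis class H over domain X:
% the smallest d admitting an embedding that realizes every h in H
% exactly as a halfspace (standard definition, shown for context).
\[
  \mathrm{dc}(\mathcal{H}) \;=\; \min\Bigl\{\, d \in \mathbb{N} \;:\;
    \exists\, \phi : \mathcal{X} \to \mathbb{R}^d,\;
    \forall h \in \mathcal{H}\;\; \exists\, w_h \in \mathbb{R}^d,\;
    \forall x \in \mathcal{X},\;
    \operatorname{sign}\bigl(\langle w_h, \phi(x) \rangle\bigr) = h(x)
  \,\Bigr\}.
\]

% Probabilistic (approximate) variant, sketched under the assumption
% that agreement is required only up to an eps-fraction of any
% distribution D over X; the paper's formal definition may order the
% quantifiers differently.
\[
  \mathrm{dc}_{\varepsilon}(\mathcal{H}) \;=\; \min\Bigl\{\, d \in \mathbb{N} \;:\;
    \forall\, \mathcal{D} \text{ over } \mathcal{X},\;
    \exists\, \phi : \mathcal{X} \to \mathbb{R}^d,\;
    \forall h \in \mathcal{H}\;\; \exists\, w_h \in \mathbb{R}^d,\;
    \Pr_{x \sim \mathcal{D}}\bigl[\operatorname{sign}(\langle w_h, \phi(x) \rangle) \neq h(x)\bigr] \le \varepsilon
  \,\Bigr\}.
\]
```

The analogous margin-complexity relaxation replaces the dimension $d$ with a bound on the norms of $w_h$ and $\phi(x)$, measuring the margin rather than the dimension of the embedding.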