Provably Strict Generalisation Benefit for Invariance in Kernel Methods

21 May 2021 (modified: 25 Oct 2021). NeurIPS 2021 Poster.
Keywords: generalization, kernel methods, invariance, equivariance, symmetry, geometric deep learning, statistical learning theory
TL;DR: Strict generalisation benefit for invariance in kernel ridge regression
Abstract: It is a commonly held belief that enforcing invariance improves generalisation. Although this approach enjoys widespread popularity, it is only very recently that a rigorous theoretical demonstration of this benefit has been established. In this work we build on the function space perspective of Elesedy and Zaidi [8] to derive a strictly non-zero generalisation benefit of incorporating invariance in kernel ridge regression when the target is invariant to the action of a compact group. We study invariance enforced by feature averaging and find that generalisation is governed by a notion of effective dimension that arises from the interplay between the kernel and the group. In building towards this result, we find that the action of the group induces an orthogonal decomposition of both the reproducing kernel Hilbert space and its kernel, which may be of interest in its own right.
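The abstract describes enforcing invariance by feature averaging: averaging the kernel's feature map over the orbit of a compact group, which turns kernel ridge regression into a G-invariant predictor. A minimal sketch of this idea, using the cyclic shift group acting on vectors and a Gaussian RBF kernel (the function names and the choice of group/kernel are illustrative assumptions, not the paper's code):

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def orbit(X):
    # All cyclic shifts of each row: the orbit under the cyclic group C_d
    # acting on R^d by coordinate rotation. Shape: (|G|, n, d).
    d = X.shape[1]
    return np.stack([np.roll(X, s, axis=1) for s in range(d)])

def averaged_kernel(X, Y, gamma=1.0):
    # Kernel of the feature-averaged map phi_bar(x) = (1/|G|) sum_g phi(gx):
    #   k_bar(x, y) = (1/|G|^2) sum_{g,h} k(gx, hy).
    # k_bar is invariant in both arguments under the group action.
    OX, OY = orbit(X), orbit(Y)
    G = OX.shape[0]
    K = np.zeros((X.shape[0], Y.shape[0]))
    for g in range(G):
        for h in range(G):
            K += rbf(OX[g], OY[h], gamma)
    return K / G**2

def krr_fit_predict(Xtr, ytr, Xte, lam=1e-3, gamma=1.0):
    # Kernel ridge regression with the group-averaged kernel: the resulting
    # predictor is invariant to cyclic shifts of its input.
    K = averaged_kernel(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(Xtr)), ytr)
    return averaged_kernel(Xte, Xtr, gamma) @ alpha
```

Because `k_bar(gx, y) = k_bar(x, y)` for every group element `g`, predictions agree on all inputs in the same orbit; the paper's result quantifies how much this restriction to invariant functions improves generalisation when the target itself is invariant.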