Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Learning Theory, Unlabeled Data, Kernel Methods, Semi-supervised Learning, Representation Learning, Label Propagation
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: STKR leverages unlabeled data by mixing information from a base kernel and the data distribution via diffusion. We provide new STKR estimators applicable to the inductive setting, together with statistical guarantees and complexity analysis.
Abstract: Unlabeled data is a key component of modern machine learning. In general, the role
of unlabeled data is to impose a form of smoothness, usually from the similarity
information encoded in a base kernel, such as the ϵ-neighbor kernel or the adjacency
matrix of a graph. This work revisits the classical idea of spectrally transformed
kernel regression (STKR), and provides a new class of general and scalable STKR
estimators able to leverage unlabeled data. Intuitively, via spectral transformation,
STKR exploits the data distribution, about which unlabeled data can provide additional
information. First, we show that STKR is a principled and general approach,
by characterizing a universal type of “target smoothness”, and proving that any
sufficiently smooth function can be learned by STKR. Second, we provide scalable
STKR implementations for the inductive setting and a general transformation
function, while prior work is mostly limited to the transductive setting. Third, we
derive statistical guarantees for two scenarios: STKR with a known polynomial
transformation, and STKR with kernel PCA when the transformation is unknown.
Overall, we believe that this work deepens our understanding of how to work
with unlabeled data, and that its generality can help inspire new methods.
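
To make the idea concrete, the following is a minimal NumPy sketch of transductive STKR with a known polynomial transformation s(lambda) = sum_j c_j lambda^j, assuming an RBF base kernel: the base kernel is computed on labeled and unlabeled points together, its eigenvalues are transformed by s, and kernel ridge regression is fit on the labeled block of the transformed kernel. The names (stkr_poly_fit_predict, poly_coeffs) are illustrative assumptions, not the paper's implementation; the paper's inductive and scalable estimators go beyond this eigendecomposition-based sketch.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Base kernel k(x, y) = exp(-gamma * ||x - y||^2)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def stkr_poly_fit_predict(X_lab, y_lab, X_unlab, poly_coeffs, reg=1e-2, gamma=1.0):
    # Build the base kernel on labeled + unlabeled points, apply the polynomial
    # spectral transformation s(lambda) = sum_j c_j * lambda^j to its eigenvalues,
    # then fit kernel ridge regression on the labeled block of the transformed kernel.
    X_all = np.vstack([X_lab, X_unlab])
    K = rbf_kernel(X_all, X_all, gamma)                    # base kernel on all points
    evals, evecs = np.linalg.eigh(K)                       # spectral decomposition
    s_evals = sum(c * evals ** j for j, c in enumerate(poly_coeffs))  # s(lambda)
    K_s = (evecs * s_evals) @ evecs.T                      # transformed kernel s(K)
    n_lab = len(X_lab)
    alpha = np.linalg.solve(K_s[:n_lab, :n_lab] + reg * np.eye(n_lab), y_lab)
    return K_s[:, :n_lab] @ alpha                          # predictions for all points

# Toy usage with s(lambda) = lambda + 0.5 * lambda^2 (coefficients c_0, c_1, c_2):
rng = np.random.default_rng(0)
X_lab, X_unlab = rng.normal(size=(10, 2)), rng.normal(size=(100, 2))
y_lab = np.sin(X_lab[:, 0])
preds = stkr_poly_fit_predict(X_lab, y_lab, X_unlab, poly_coeffs=[0.0, 1.0, 0.5])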
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Primary Area: learning theory
Submission Number: 6707