Tensor Network-Constrained Kernel Machines as Gaussian Processes

Published: 22 Jan 2025, Last Modified: 25 Feb 2025, AISTATS 2025 Poster, CC BY 4.0
TL;DR: In this paper, we establish a new connection between Tensor Network-constrained kernel machines and Gaussian Processes.
Abstract: In this paper we establish a new connection between Tensor Network (TN)-constrained kernel machines and Gaussian Processes (GPs). We prove that, when appropriate i.i.d. priors are specified across their components, the outputs of Canonical Polyadic Decomposition (CPD)- and Tensor Train (TT)-constrained kernel machines converge in the limit of large ranks to the same GP, which we fully characterize. We show that TT-constrained models achieve faster convergence to the GP than their CPD counterparts for the same number of model parameters. The convergence to the GP occurs as the ranks tend to infinity, in contrast to the standard approach, which introduces TNs as an additional constraint on the posterior. This implies that the newly established priors allow the models to learn features more freely, since they require infinitely many parameters to converge to a GP, which is characterized by a fixed feature representation and thus performs no feature learning. As a consequence, the newly derived priors yield more flexible models which can better fit the data, albeit at an increased risk of overfitting. We demonstrate these considerations by means of two numerical experiments.
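To give a flavour of the large-rank convergence claim, the following toy sketch (illustrative only, not the authors' code) samples a CPD-constrained kernel machine f(x) = Σ_r Π_d ⟨w_d^r, φ(x_d)⟩ under i.i.d. Gaussian priors on the factor entries and checks that the prior distribution of the output at a fixed input becomes increasingly Gaussian as the rank R grows. The polynomial feature map, unit prior variance, and 1/√R output scaling are assumptions made here for the demonstration, not necessarily the paper's exact choices.

```python
# Toy sketch: prior output distribution of a CPD-constrained kernel machine
# approaching Gaussianity as the CPD rank R grows (assumed setup, see above).
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
D, M = 3, 4                              # number of input dimensions, per-dimension feature size
x = rng.uniform(-1.0, 1.0, size=D)       # a fixed test input

def features(xd, M):
    """Per-dimension polynomial feature map phi(x_d) = [1, x_d, x_d^2, ...] (assumed choice)."""
    return xd ** np.arange(M)

phi = [features(x[d], M) for d in range(D)]

def cpd_output_samples(R, n_samples=10000):
    """Prior samples of f(x) = sum_r prod_d <w_d^r, phi(x_d)> / sqrt(R)."""
    out = np.empty(n_samples)
    for s in range(n_samples):
        # i.i.d. standard normal priors on every CPD factor entry
        factors = [rng.standard_normal((M, R)) for _ in range(D)]
        # per-rank products over dimensions, then sum over rank with CLT-style scaling
        prod = np.ones(R)
        for d in range(D):
            prod *= phi[d] @ factors[d]
        out[s] = prod.sum() / np.sqrt(R)
    return out

for R in (1, 4, 64):
    samples = cpd_output_samples(R)
    # excess kurtosis tends to 0 as the output distribution approaches a Gaussian
    print(f"rank R={R:3d}: var={samples.var():.3f}, excess kurtosis={kurtosis(samples):+.3f}")
```

Under this scaling the prior variance of f(x) stays constant across ranks while the excess kurtosis shrinks toward zero, which is the central-limit mechanism behind the GP limit described in the abstract.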
Submission Number: 721