Online approximate factorization of a kernel matrix by a Hebbian neural network

Published: 28 Jan 2022 · Last Modified: 13 Feb 2023 · ICLR 2022 Submitted · Readers: Everyone
Keywords: online kernel methods, Hebbian learning, similarity matching
Abstract: We derive an online algorithm for unsupervised learning based on representing every input $\mathbf{x}_t$ by a high-dimensional vector $\mathbf{y}_t$ whose pairwise inner products approximately match input similarities as measured by a kernel function: $\mathbf{y}_s \cdot \mathbf{y}_{t} \approx f(\mathbf{x}_s, \mathbf{x}_{t})$. The approximation is formulated using the objective function of classical multidimensional scaling. We derive an upper bound on this objective that involves only correlations between output vectors and nonlinear functions of input vectors. Minimizing this upper bound leads to a minimax optimization, which can be solved via stochastic gradient descent-ascent. The resulting online algorithm can be interpreted as a recurrent neural network with Hebbian and anti-Hebbian connections, generalizing previous work on linear similarity matching. Through numerical experiments on two datasets, we demonstrate that unsupervised learning can be aided by the nonlinearity inherent in our kernel method. We also show that heavy-tailed representation vectors emerge from the learning even though no sparseness prior is used, lending further biological plausibility to the model. Our upper bound employs a rank-one Nyström approximation to the kernel function; its novelty is that it leads to an online algorithm that optimizes landmark placement.
One-sentence Summary: An online algorithm for factorizing a kernel matrix, implemented by a neural network with Hebbian and anti-Hebbian plasticity
Supplementary Material: zip
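
The abstract above specifies the ingredients of the method but no pseudocode. The following is a minimal NumPy sketch, not the authors' implementation: it factors a kernel matrix by learning output vectors $\mathbf{y}_t$ with $\mathbf{y}_s \cdot \mathbf{y}_t \approx f(\mathbf{x}_s, \mathbf{x}_t)$ via plain stochastic gradient steps on the classical-MDS strain objective, and it includes the standard rank-one Nyström formula the abstract refers to. The Gaussian kernel, output dimension `k`, step size `eta`, and step count are illustrative assumptions; the paper's actual algorithm instead solves a minimax reformulation by descent-ascent, which maps onto Hebbian/anti-Hebbian network dynamics and optimizes the Nyström landmark online.

```python
import numpy as np

def gaussian_kernel(a, b, gamma=1.0):
    """Illustrative choice of kernel f; the paper's derivation is generic in f."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def nystrom_rank1(x_s, x_t, w, gamma=1.0):
    """Standard rank-one Nystrom approximation with landmark w:
    f(x_s, x_t) ~= f(x_s, w) * f(w, x_t) / f(w, w).
    In the paper the landmark placement is itself optimized online;
    here w is simply a fixed argument."""
    return (gaussian_kernel(x_s, w, gamma) * gaussian_kernel(w, x_t, gamma)
            / gaussian_kernel(w, w, gamma))

def online_kernel_factorization(X, k=20, eta=0.05, n_steps=20000, gamma=1.0, seed=0):
    """Learn outputs y_t with y_s . y_t ~= f(x_s, x_t) by SGD on the
    classical-MDS strain  sum_{s,t} (y_s . y_t - f(x_s, x_t))^2.
    A plain-SGD stand-in for the paper's minimax descent-ascent dynamics."""
    rng = np.random.default_rng(seed)
    T = X.shape[0]
    Y = 0.1 * rng.standard_normal((T, k))            # output representations
    for _ in range(n_steps):
        s, t = rng.choice(T, size=2, replace=False)  # random distinct input pair
        err = Y[s] @ Y[t] - gaussian_kernel(X[s], X[t], gamma)
        gs, gt = err * Y[t], err * Y[s]              # half-squared-error gradients
        Y[s] -= eta * gs
        Y[t] -= eta * gt
    return Y

if __name__ == "__main__":
    X = np.random.default_rng(1).standard_normal((200, 5))
    Y = online_kernel_factorization(X)
    # spot-check one pair: learned inner product vs. kernel value
    print(Y[0] @ Y[1], gaussian_kernel(X[0], X[1]))
```

The sketch only shows the objective being targeted; the biological interpretation in the paper comes from rewriting this minimization as a saddle-point problem whose online updates are local (Hebbian and anti-Hebbian), which plain SGD on the strain objective does not exhibit.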