Topological Neural Discrete Representation Learning à la Kohonen

Published: 20 Jun 2023, Last Modified: 11 Oct 2023, SODS 2023 Oral
Keywords: Self-Organising Maps, Kohonen Maps, Vector Quantization, Variational Autoencoders, VQ-VAEs
TL;DR: We use the learning rule of Kohonen Self-Organising Maps as the vector quantisation algorithm for VQ-VAEs
Abstract: Unsupervised learning of discrete representations in neural networks (NNs) from continuous ones is essential for many modern applications. Vector Quantisation (VQ) has become popular for this, in particular in the context of generative models such as Variational Auto-Encoders (VAEs), where the exponential moving average-based VQ (EMA-VQ) algorithm is often used. Here we study an alternative VQ algorithm based on Kohonen's learning rule for the Self-Organising Map (KSOM; Kohonen, 1982), a classic VQ algorithm that offers two potential benefits over its special case, EMA-VQ: empirically, KSOM converges faster than EMA-VQ, and the discrete representations it learns form a topological structure on the grid whose nodes are the discrete symbols, yielding an artificial analogue of the brain's topographic maps. We revisit these properties by using KSOM in VQ-VAEs for image processing. In our experiments, the speed-up over well-configured EMA-VQ is only observable at the beginning of training, but KSOM is generally much more robust, e.g., w.r.t. the choice of initialisation scheme.
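The abstract describes using Kohonen's SOM learning rule as the quantiser: pick the nearest codebook vector (the standard VQ step), then pull that code and its neighbours on the grid toward the input. The following is a minimal NumPy sketch of one such update, not the paper's implementation; the function name, grid layout, and hyperparameters (`lr`, `sigma`) are illustrative assumptions.

```python
import numpy as np

def ksom_update(codebook, grid_coords, x, lr=0.5, sigma=1.0):
    """One Kohonen SOM step (illustrative sketch, not the paper's code).

    codebook:    (K, D) array of codebook vectors, one per grid node
    grid_coords: (K, 2) grid positions of each code
    x:           (D,) input vector (e.g. an encoder output in a VQ-VAE)
    Returns the index of the best-matching unit (the discrete symbol).
    """
    # 1) Nearest-neighbour search: the usual VQ assignment step.
    bmu = np.argmin(((codebook - x) ** 2).sum(axis=1))
    # 2) Gaussian neighbourhood weights on the grid, centred at the BMU.
    d2 = ((grid_coords - grid_coords[bmu]) ** 2).sum(axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))  # shape (K,)
    # 3) Move the BMU *and* its grid neighbours toward x; this neighbourhood
    #    coupling is what induces the topological structure on the grid.
    #    If h were one-hot at the BMU, this would reduce to an EMA-VQ-style
    #    update of the winning code only.
    codebook += lr * h[:, None] * (x - codebook)
    return bmu

# Usage: a 4x4 grid of 8-dimensional codes.
rng = np.random.default_rng(0)
coords = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
codes = rng.normal(size=(16, 8))
x = rng.normal(size=8)
before = codes.copy()
idx = ksom_update(codes, coords, x)
```

Shrinking `sigma` toward zero over training recovers plain VQ behaviour, which is one way to read the abstract's claim that EMA-VQ is a special case of KSOM.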
Submission Number: 24