Representation Collapsing Problems in Vector Quantization

Published: 12 Oct 2024, Last Modified: 14 Nov 2024 · SafeGenAi Poster · CC BY 4.0
Keywords: Vector Quantization, VQ-VAE, Discrete Representation Learning, Representation Collapse
Abstract: Vector quantization is a technique in machine learning that discretizes continuous representations into a set of discrete vectors. It is widely employed in tokenizing data representations for large language models, diffusion models, and other generative models. Despite its prevalence, the characteristics and behaviors of vector quantization in generative models remain largely underexplored. In this study, we systematically investigate representation collapse in vector quantization, where collapsed representations are observed across both discrete codebook tokens and continuous latent embeddings. By leveraging both synthetic and real datasets, we characterize the severity of each type of collapse, identify the conditions that lead to it, and analyze its underlying causes. Accordingly, we propose potential solutions aimed at mitigating these collapses. To the best of our knowledge, this is the first comprehensive study examining representation collapsing problems in vector quantization.
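For readers unfamiliar with the quantization step the abstract describes, here is a minimal PyTorch sketch of a standard VQ-VAE-style quantizer (nearest-neighbor codebook lookup with the usual codebook/commitment losses and a straight-through estimator). The class name, codebook size, and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    """Minimal VQ-VAE-style quantizer (illustrative sketch, not the paper's code)."""

    def __init__(self, num_codes=512, code_dim=64, beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, code_dim)
        self.codebook.weight.data.uniform_(-1.0 / num_codes, 1.0 / num_codes)
        self.beta = beta  # commitment loss weight

    def forward(self, z):
        # z: (batch, code_dim) continuous encoder outputs.
        # Nearest-neighbor lookup: Euclidean distance to every codebook vector.
        d = torch.cdist(z, self.codebook.weight)   # (batch, num_codes)
        idx = d.argmin(dim=1)                      # discrete token ids
        z_q = self.codebook(idx)                   # quantized vectors
        # Codebook loss pulls codes toward encoder outputs; commitment loss
        # pulls encoder outputs toward their assigned codes.
        loss = F.mse_loss(z_q, z.detach()) + self.beta * F.mse_loss(z, z_q.detach())
        # Straight-through estimator: copy gradients from z_q back to z.
        z_q = z + (z_q - z).detach()
        return z_q, idx, loss

# Example usage: quantize a batch and measure codebook usage. Perplexity far
# below num_codes is a common symptom of the token collapse the paper studies.
vq = VectorQuantizer()
z = torch.randn(1024, 64)
z_q, idx, loss = vq(z)
probs = torch.bincount(idx, minlength=512).float() / idx.numel()
perplexity = torch.exp(-(probs * probs.clamp_min(1e-10).log()).sum())
print(f"codebook perplexity: {perplexity:.1f} / 512")
```

Codebook perplexity, as computed above, is one widely used diagnostic: if only a handful of tokens are ever selected, the discrete representation has collapsed regardless of reconstruction quality.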
Submission Number: 226