Variational Classification

Published: 07 Nov 2023 · Last Modified: 13 Dec 2023 · M3L 2023 Poster
Keywords: classification, neural networks, latent variable model, softmax
TL;DR: A probabilistic model that generalises a softmax classifier, providing interpretability and improving calibration, robustness and sample efficiency.
Abstract: We present *variational classification* (VC), a latent-variable generalisation of neural network softmax classification under cross-entropy loss. Our approach provides a novel probabilistic interpretation of the familiar softmax classification model, to which it relates as a variational autoencoder relates to a deterministic autoencoder. We derive a training objective based on the evidence lower bound (ELBO), which is non-trivial to optimise, and an adversarial approach to maximise it. We reveal an inherent inconsistency within softmax classification that VC addresses, while also allowing flexible choices of latent-space distributions in place of the assumptions implicit in standard softmax classifiers. Empirical evaluation demonstrates that VC maintains accuracy while improving properties such as calibration and adversarial robustness, particularly under distribution shift and in low-data settings. This work brings new theoretical insight to modern machine learning practice.
Submission Number: 57
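
The abstract describes the method only at a high level. As a rough illustration of the kind of model it refers to, the following is a minimal PyTorch sketch of a latent-variable classifier with an ELBO-style objective. It is an assumption-laden stand-in, not the authors' implementation: the class names, architecture, Gaussian class-conditional priors, and the closed-form KL term (used here in place of the adversarial approach the abstract mentions) are all illustrative choices.

```python
# Hypothetical sketch of a latent-variable ("variational") classifier.
# Names (VariationalClassifier, latent_dim, prior_mu, ...) are illustrative,
# not taken from the paper or its code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VariationalClassifier(nn.Module):
    """Encoder outputs a Gaussian q(z|x); a linear softmax head models p(y|z)."""

    def __init__(self, in_dim: int, latent_dim: int, n_classes: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, 2 * latent_dim),   # mean and log-variance of q(z|x)
        )
        self.head = nn.Linear(latent_dim, n_classes)  # softmax classifier on z
        # Class-conditional Gaussian means as a simple stand-in prior p(z|y).
        self.prior_mu = nn.Parameter(torch.randn(n_classes, latent_dim))

    def forward(self, x: torch.Tensor):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterised sample
        return self.head(z), mu, logvar


def elbo_style_loss(model, x, y, beta: float = 0.1):
    """Cross-entropy on a sampled z plus a KL term pulling q(z|x) toward p(z|y).

    The closed-form Gaussian KL here replaces the adversarial estimator the
    abstract alludes to; it only shows the general shape of the objective.
    """
    logits, mu, logvar = model(x)
    ce = F.cross_entropy(logits, y)
    prior_mu = model.prior_mu[y]  # mean of p(z|y), unit variance assumed
    kl = 0.5 * (logvar.exp() + (mu - prior_mu) ** 2 - 1.0 - logvar).sum(-1).mean()
    return ce + beta * kl


if __name__ == "__main__":
    model = VariationalClassifier(in_dim=20, latent_dim=8, n_classes=3)
    x, y = torch.randn(16, 20), torch.randint(0, 3, (16,))
    loss = elbo_style_loss(model, x, y)
    loss.backward()
    print(f"loss: {loss.item():.3f}")
```

Setting the latent distribution to a point mass and dropping the KL term recovers an ordinary softmax classifier trained with cross-entropy, which is the sense in which such a model generalises softmax classification.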