Supplementary Material: pdf
Keywords: classification, neural networks, latent variable model, softmax
TL;DR: A probabilistic model that generalises a softmax classifier, providing interpretability and improving calibration, robustness and sample efficiency.
Abstract: We present *variational classification* (VC), a latent variable generalisation of neural network softmax classification under cross-entropy loss. Our approach offers a novel probabilistic interpretation of the familiar softmax classification model, to which it relates much as a variational autoencoder relates to a deterministic autoencoder. We derive a training objective based on the evidence lower bound (ELBO), which is non-trivial to optimise, and an adversarial approach to maximising it. We reveal an inherent inconsistency within softmax classification that VC addresses, while also allowing flexible choices of distributions in the latent space in place of the assumptions implicit in standard softmax classifiers. Empirical evaluation demonstrates that VC maintains accuracy while improving properties such as calibration and adversarial robustness, particularly under distribution shift and in low-data settings. By explicitly considering the representations learned by supervised methods, we offer the prospect of principled merging of supervised learning with other representation learning methods, e.g. contrastive learning, using a common encoder architecture.
Track: Extended Abstract Track
Submission Number: 20
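
For context, a minimal sketch of the latent-variable view the abstract describes. The notation ($x$ input, $y$ label, $z$ latent, encoder posterior $q(z \mid x)$, class-conditional prior $p(z \mid y)$) is ours for illustration and may differ from the paper's; classification is read as marginalising a Bayes-rule classifier over the latent:

$$
p(y \mid x) \;=\; \mathbb{E}_{q(z \mid x)}\big[\, p(y \mid z)\,\big],
\qquad
p(y \mid z) \;=\; \frac{p(z \mid y)\, p(y)}{\sum_{y'} p(z \mid y')\, p(y')},
$$

$$
\log p(y \mid x) \;\ge\; \mathbb{E}_{q(z \mid x)}\big[\, \log p(y \mid z)\,\big]
\quad \text{(Jensen's inequality, giving an ELBO-style training objective).}
$$

Under this reading, a standard softmax classifier can be recovered by taking $q(z \mid x)$ to be a point mass at the encoder output and $p(z \mid y)$ to be, say, Gaussians with shared covariance, which makes $p(y \mid z)$ a softmax over affine logits; the inconsistency the abstract mentions is that the distribution the encoder actually induces over $z$ need not match these implicitly assumed class-conditionals.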