Towards Unsupervised Classification with Deep Generative Models

Dimitris Kalatzis, Konstantia Kotta, Ilias Kalamaras, Anastasios Vafeiadis, Andrew Rawstron, Dimitris Tzovaras, Kostas Stamatopoulos

Feb 15, 2018 (modified: Oct 27, 2017) ICLR 2018 Conference Blind Submission
  • Abstract: Deep generative models have advanced the state of the art in semi-supervised classification; however, their capacity to derive useful discriminative features in a completely unsupervised fashion, for classification on difficult real-world datasets where adequate manifold separation is required, has not been thoroughly explored. Most methods rely on a pipeline that first derives features via generative modeling and then applies a clustering algorithm, thereby separating the modeling and discriminative processes. We propose a deep hierarchical generative model that uses a mixture of discrete and continuous distributions to learn to effectively separate the different data manifolds and is trainable end-to-end. We show that by specifying the form of the discrete variable distribution we impose a specific structure on the model's latent representations. We test our model's discriminative performance on the task of CLL (chronic lymphocytic leukemia) diagnosis against baselines from the field of computational flow cytometry (FC), as well as from the Variational Autoencoder literature.
  • TL;DR: Unsupervised classification via deep generative modeling with controllable feature learning, evaluated on a difficult real-world task
  • Keywords: variational inference, vae, variational autoencoders, generative modeling, representation learning, classification
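
The abstract describes a latent structure mixing a discrete variable (indexing the data manifold or class) with continuous variables. As a hedged illustration only — the authors' actual architecture, dimensions, and parameterization are not given here — the generative side of such a model can be sketched as ancestral sampling from a Gaussian-mixture-style prior; all names and sizes below are assumptions:

```python
import numpy as np

# Minimal sketch of a discrete + continuous latent generative structure:
# y ~ Categorical(pi) picks a mixture component (candidate class/manifold),
# z ~ N(mu_y, diag(sigma_y^2)) is the continuous latent for that component,
# x is generated from z via a toy linear "decoder".
# Every dimension and parameter here is illustrative, not from the paper.

rng = np.random.default_rng(0)

K = 3      # assumed number of discrete components (classes)
D_Z = 2    # assumed continuous latent dimension
D_X = 5    # assumed observation dimension

pi = np.full(K, 1.0 / K)                  # uniform discrete prior p(y)
mu = rng.normal(size=(K, D_Z))            # per-component means of p(z|y)
log_sigma = np.zeros((K, D_Z))            # per-component log std-devs
W = rng.normal(size=(D_Z, D_X))           # toy linear decoder for p(x|z)

def sample(n):
    """Ancestral sampling: y ~ Cat(pi), z ~ N(mu_y, sigma_y), x ~ N(zW, I)."""
    y = rng.choice(K, size=n, p=pi)
    z = mu[y] + np.exp(log_sigma[y]) * rng.normal(size=(n, D_Z))
    x = z @ W + rng.normal(size=(n, D_X))
    return y, z, x

y, z, x = sample(1000)
print(y.shape, z.shape, x.shape)  # → (1000,) (1000, 2) (1000, 5)
```

In a model of the kind the abstract describes, inference over `y` (e.g. via a recognition network trained end-to-end) is what yields the unsupervised class assignment; the sketch above shows only the generative direction.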