Abstract: We propose \textit{deep amortized clustering} (DAC), a neural architecture that learns to cluster datasets efficiently in a few forward passes. DAC implicitly learns what makes a cluster, how to group data points into clusters, and how to count the number of clusters in a dataset. DAC is meta-learned from labelled training datasets, unlike traditional clustering algorithms, which usually require hand-specified prior knowledge about cluster shapes and structures. We empirically show, on both synthetic and image data, that DAC can efficiently and accurately cluster new datasets drawn from the same distribution used to generate the training datasets.
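To make the "clustering in a few forward passes" idea concrete, here is a minimal sketch, not the authors' architecture (see the repository linked below for that). It assumes a simple permutation-invariant set encoder that proposes one cluster (center and radius) per forward pass; claimed points are peeled off and the loop stops when few points remain. The module sizes, the `ClusterProposer` name, and the stopping rule are all illustrative assumptions.

```python
# Illustrative sketch of amortized clustering via repeated forward passes.
# This is an assumed toy setup, not the DAC model itself.
import torch
import torch.nn as nn

class ClusterProposer(nn.Module):
    """Encodes a set of points and proposes one cluster (center + radius)."""
    def __init__(self, dim=2, hidden=128):
        super().__init__()
        self.point_enc = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                       nn.Linear(hidden, hidden))
        self.head = nn.Sequential(nn.ReLU(), nn.Linear(hidden, dim + 1))

    def forward(self, x):                      # x: (n, dim), one dataset
        h = self.point_enc(x).mean(dim=0)      # permutation-invariant pooling
        out = self.head(h)
        center, log_radius = out[:-1], out[-1]
        return center, log_radius.exp()

@torch.no_grad()
def amortized_cluster(model, x, max_clusters=10, min_points=2):
    """Peel off one proposed cluster per forward pass until few points remain."""
    labels = torch.full((x.shape[0],), -1, dtype=torch.long)
    remaining = torch.arange(x.shape[0])
    for k in range(max_clusters):
        if remaining.numel() < min_points:
            break
        center, radius = model(x[remaining])
        member = (x[remaining] - center).norm(dim=1) < radius
        if member.sum() == 0:                  # nothing claimed: stop
            break
        labels[remaining[member]] = k          # assign cluster id k
        remaining = remaining[~member]
    return labels

# Usage: an untrained model only demonstrates the interface; in DAC the
# proposer is meta-trained on many labelled datasets so that its proposals
# match the true clusters of new datasets from the same distribution.
x = torch.randn(100, 2)
print(amortized_cluster(ClusterProposer(), x))
```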
Code: https://github.com/ICLR2020anonymous/dac
Keywords: clustering, amortized inference, meta learning, deep learning