Deep Continuous Clustering

15 Feb 2018 (modified: 22 Oct 2023) · ICLR 2018 Conference Blind Submission
Abstract: Clustering high-dimensional datasets is hard because interpoint distances become less informative in high-dimensional spaces. We present a clustering algorithm that performs nonlinear dimensionality reduction and clustering jointly. The data is embedded into a lower-dimensional space by a deep autoencoder, which is optimized as part of the clustering process; the resulting network produces clustered data. The presented approach does not rely on prior knowledge of the number of ground-truth clusters. Joint nonlinear dimensionality reduction and clustering are formulated as optimization of a global continuous objective. We thus avoid the discrete reconfigurations of the objective that characterize prior clustering algorithms. Experiments on datasets from multiple domains demonstrate that the presented algorithm outperforms state-of-the-art clustering schemes, including recent methods that use deep networks.
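To make the idea of a single global continuous objective concrete, here is a minimal numpy sketch. It is an illustrative simplification, not the paper's implementation: a linear encoder/decoder stands in for the deep autoencoder, the edge list stands in for the paper's nearest-neighbor graph, and the weights `lam` and `mu` are hypothetical. The clustering term pulls connected representatives together through a Geman-McClure robust penalty, so the whole loss stays continuous and no discrete cluster assignments are ever made.

```python
import numpy as np

rng = np.random.default_rng(0)

def geman_mcclure(sq_dist, mu):
    # Robust penalty: ~quadratic near zero, saturates at mu for large
    # distances, so far-apart pairs exert vanishing pull on each other.
    return mu * sq_dist / (mu + sq_dist)

def joint_objective(X, W_enc, W_dec, Z, edges, lam=1.0, mu=1.0):
    # Linear "autoencoder" embedding (stand-in for the deep network).
    Y = X @ W_enc
    recon = np.sum((X - Y @ W_dec) ** 2)        # reconstruction term
    attach = np.sum((Y - Z) ** 2)               # z_i should track its embedding y_i
    pull = sum(geman_mcclure(np.sum((Z[i] - Z[j]) ** 2), mu)
               for i, j in edges)               # continuous clustering term
    return recon + lam * (attach + pull)

# Toy run: 6 points in 2 loose groups, with edges only within each group.
X = np.vstack([rng.normal(0.0, 0.1, (3, 4)),
               rng.normal(3.0, 0.1, (3, 4))])
W_enc = rng.normal(size=(4, 2))
W_dec = rng.normal(size=(2, 4))
Z = X @ W_enc                                   # initialize representatives at the embeddings
edges = [(0, 1), (1, 2), (3, 4), (4, 5)]
loss = joint_objective(X, W_enc, W_dec, Z, edges)
```

Because every term is differentiable in the network weights and the representatives `Z`, the whole expression can be minimized by ordinary gradient descent; clusters are then read off from the connected components of representatives that have collapsed together.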
TL;DR: A clustering algorithm that performs joint nonlinear dimensionality reduction and clustering by optimizing a global continuous objective.
Keywords: clustering, dimensionality reduction
Code: [shahsohil/DCC](https://github.com/shahsohil/DCC) · [2 community implementations](https://paperswithcode.com/paper/?openreview=SJzMATlAZ)
Data: [MNIST](https://paperswithcode.com/dataset/mnist), [RCV1](https://paperswithcode.com/dataset/rcv1)
Community Implementations: [3 code implementations](https://www.catalyzex.com/paper/arxiv:1803.01449/code)