Leveraging Hierarchical Similarities for Contrastive Clustering

Published: 01 Jan 2023, Last Modified: 05 Feb 2025 · ICONIP (8) 2023 · CC BY-SA 4.0
Abstract: Recently, contrastive clustering has demonstrated strong performance in deep clustering owing to its powerful feature extraction capabilities. However, existing contrastive clustering methods suffer from inter-class conflicts and often produce suboptimal clustering outcomes because they disregard latent class information. To address this issue, we propose a novel method called Contrastive learning using Hierarchical data similarities for Deep Clustering (CHDC), consisting of three modules: the inter-class separation enhancer, the intra-class compactness enhancer, and the clustering module. Specifically, to exploit latent class information through sample pairs with hierarchical data similarities, the inter-class separation enhancer and the intra-class compactness enhancer handle negative and positive sample pairs, respectively. The clustering module then aligns the cluster assignments of samples with those of their neighboring samples. Working collaboratively, these three modules alleviate inter-class conflicts, allowing CHDC to learn more discriminative features. Lastly, we design a novel update method for positive sample pairs that reduces the likelihood of introducing erroneous information. To evaluate the performance of CHDC, we conduct extensive experiments on five widely adopted image classification datasets. The experimental results demonstrate the superiority of CHDC over state-of-the-art methods, and ablation studies confirm the effectiveness of the proposed modules.
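To make the positive/negative pair handling concrete, below is a minimal sketch of an InfoNCE-style contrastive loss that pulls a positive pair together while pushing negatives apart. This is a generic illustration of the contrastive objective the abstract builds on, not the paper's exact CHDC loss; the function names, temperature value, and cosine-similarity choice are assumptions for the sketch.

```python
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def contrastive_loss(anchor, positive, negatives, temperature=0.5):
    # InfoNCE-style loss for one anchor: the loss is low when the
    # anchor is similar to its positive pair and dissimilar to the
    # negatives (hypothetical sketch, not the CHDC loss itself).
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = sum(math.exp(cosine(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))
```

In a hierarchical-similarity setting, one could weight each negative's contribution by how dissimilar it is judged to be, so that likely same-class "negatives" are pushed apart less aggressively, which is the kind of inter-class conflict the abstract targets.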