Dynamic Clustering and Cluster Contrastive Learning for Unsupervised Person Re-Id With Feature Distribution Alignment

Published: 01 Jan 2024 · Last Modified: 13 Nov 2024 · ICASSP 2024 · License: CC BY-SA 4.0
Abstract: Unsupervised Re-ID methods aim to learn robust and discriminative features from unlabeled data. However, existing methods often ignore the noise arising from distribution discrepancy during network training, which can lead to feature misalignment and hinder model performance. To address this problem, we propose a Dynamic Clustering and Cluster Contrastive Learning (DCCC) method. Specifically, we first design a Dynamic Clustering Parameters Scheduler (DCPS) that adjusts the clustering algorithm to fit the variation of feature distances, alleviating, from a global perspective, the distribution noise caused by unreasonable hyper-parameter settings. Then, a Dynamic Cluster Contrastive Learning (DyCL) method is proposed to tackle the distribution discrepancy in batch training through re-weighting allocation from a local perspective. We also introduce a Label Smoothing Soft Contrastive Loss (L_ss) that combines the DyCL loss and the self-supervised loss with low computational cost and high efficiency. Experiments on several public datasets validate the effectiveness of the proposed DCCC, which outperforms previous state-of-the-art methods. Code is available at https://github.com/theziqi/DCCC.
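The sketch below is a minimal, hedged illustration of two ideas described in the abstract, not the authors' implementation: (1) scheduling a density-based clustering threshold from the current feature-distance statistics, in the spirit of DCPS, and (2) a label-smoothed soft contrastive loss that mixes a cluster-contrastive term with a self-supervised instance term, in the spirit of L_ss. The function names, the eps schedule, the memory layout, and the smoothing factor are all assumptions for illustration; DBSCAN is assumed as the clustering backend, as is common in unsupervised Re-ID pipelines.

```python
# Illustrative sketch only; hyper-parameters and scheduling rule are assumed.
import torch
import torch.nn.functional as F
from sklearn.cluster import DBSCAN


def scheduled_eps(features: torch.Tensor, base_eps: float = 0.6,
                  percentile: float = 0.02) -> float:
    """Pick a DBSCAN eps from the observed pairwise-distance distribution
    so the clustering threshold tracks how compact the features currently are."""
    feats = F.normalize(features, dim=1)
    dist = torch.cdist(feats, feats)            # pairwise L2 distances
    dist.fill_diagonal_(float('inf'))           # ignore self-distances
    k = max(1, int(percentile * dist.numel()))
    small = torch.topk(dist.flatten(), k, largest=False).values
    return 0.5 * base_eps + 0.5 * small.mean().item()   # blend prior and data


def cluster_labels(features: torch.Tensor, eps: float) -> torch.Tensor:
    """Run DBSCAN on L2-normalized features; -1 marks un-clustered outliers."""
    feats = F.normalize(features, dim=1).cpu().numpy()
    return torch.as_tensor(DBSCAN(eps=eps, min_samples=4).fit_predict(feats))


def label_smoothing_soft_contrastive(query: torch.Tensor,
                                     cluster_memory: torch.Tensor,
                                     instance_memory: torch.Tensor,
                                     cluster_id: torch.Tensor,
                                     instance_id: torch.Tensor,
                                     tau: float = 0.05,
                                     smooth: float = 0.1) -> torch.Tensor:
    """Cross-entropy against smoothed targets: most probability mass goes to the
    query's cluster centroid (cluster-contrastive term), a small amount to its
    own instance feature (self-supervised term)."""
    q = F.normalize(query, dim=1)
    logits_c = q @ F.normalize(cluster_memory, dim=1).t() / tau    # (B, K) cluster logits
    logits_i = q @ F.normalize(instance_memory, dim=1).t() / tau   # (B, N) instance logits
    logits = torch.cat([logits_c, logits_i], dim=1)

    targets = torch.zeros_like(logits)
    rows = torch.arange(q.size(0))
    targets[rows, cluster_id] = 1.0 - smooth
    targets[rows, cluster_memory.size(0) + instance_id] = smooth
    return -(targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```

Because the smoothed target is a single distribution over both memory banks, one softmax and one cross-entropy cover both loss terms, which is one plausible reading of the "low computational cost" claim for combining the DyCL and self-supervised losses.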
