View Gap Matters: Cross-view Topology and Information Decoupling for Multi-view Clustering

Published: 20 Jul 2024 · Last Modified: 21 Jul 2024 · MM 2024 Poster · CC BY 4.0
Abstract: Multi-view clustering, a pivotal technology in multimedia research, aims to leverage complementary information from diverse perspectives to enhance clustering performance. Current multi-view clustering methods typically reduce the distance between every pair of views, overlooking the heterogeneity between views and thereby sacrificing the diverse, valuable insights inherent in multi-view data. In this paper, we propose a Tree-Based View-Gap Maintaining Multi-View Clustering (TGM-MVC) method. Our approach introduces a novel conceptualization of multiple views as a graph structure: each view corresponds to a node, and the view gap, computed as the cosine distance between views, serves as the edge weight. Through graph pruning, we derive the minimum spanning tree of the views, which reflects the neighboring relationships among them. Specifically, we apply a shared-specific learning framework and generate view trees for both view-shared and view-specific information. For shared information, we narrow the distance only between adjacent views; for specific information, we maintain the view gap between neighboring views. Theoretical analysis highlights the risks of eliminating the view gap, and comprehensive experiments validate the efficacy of the proposed TGM-MVC method.
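The view-graph construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each view is summarized by a single feature vector (here, the mean over samples, a hypothetical choice), builds a complete graph weighted by cosine distance, and extracts the minimum spanning tree with Prim's algorithm.

```python
import numpy as np

def cosine_distance(a, b):
    # View gap as defined in the abstract: 1 minus cosine similarity.
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def view_mst(views):
    """Return MST edges (parent, child) over the complete view graph.

    `views` is a list of per-view feature matrices (samples x features).
    Each view is summarized by its mean feature vector -- an assumption
    for illustration; the paper's exact view representation may differ.
    """
    reps = [v.mean(axis=0) for v in views]
    n = len(reps)
    dist = np.array([[cosine_distance(reps[i], reps[j]) for j in range(n)]
                     for i in range(n)])
    in_tree = {0}          # grow the tree from view 0 (Prim's algorithm)
    edges = []
    while len(in_tree) < n:
        # Cheapest edge crossing the cut between tree and non-tree views.
        i, j, _ = min(((i, j, dist[i, j]) for i in in_tree
                       for j in range(n) if j not in in_tree),
                      key=lambda e: e[2])
        edges.append((i, j))
        in_tree.add(j)
    return edges
```

The resulting tree identifies, for each view, its nearest neighbors in cosine-distance terms, so that shared-information alignment can be restricted to adjacent views only.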
Primary Subject Area: [Content] Multimodal Fusion
Relevance To Conference: In real-world scenarios, we often face the challenge of learning from multiple media sources or making decisions by combining data from various sources. Multi-view learning is a significant branch of multimedia technology that addresses this challenge, and multi-view clustering (MVC) is a typical task within multi-view learning.
Supplementary Material: zip
Submission Number: 897
