Where Graph Meets Heterogeneity: Multi-View Collaborative Graph Experts

Published: 18 Sept 2025, Last Modified: 29 Oct 2025
Venue: NeurIPS 2025 (poster)
License: CC BY 4.0
Keywords: Graph neural networks, heterogeneous data, multi-view learning
Abstract: The convergence of graph learning and multi-view learning has propelled the emergence of multi-view graph neural networks (MGNNs), offering unprecedented capabilities to address complex real-world data characterized by heterogeneous yet interconnected information. While existing MGNNs exploit the potential of multi-view graphs, they often fail to harmonize the dual inductive biases critical to multi-view learning: consistency (inherent inter-view agreement) and complementarity (view-specific distinctiveness). To bridge this gap, we propose Multi-view Collaborative Graph Experts (MvCGE), a novel framework grounded in the Mixture-of-Experts (MoE) paradigm. MvCGE establishes architectural consistency through shared parameters while preserving complementarity via layer-wise collaborative graph experts, which are dynamically activated by a graph-aware routing mechanism that adapts to the structural nuances of each view. This dual-level design is further reinforced by two novel components: a load equilibrium loss to prevent expert collapse and ensure balanced specialization, and a graph discrepancy loss based on distributional divergence to enhance inter-view complementarity. Extensive experiments on diverse datasets demonstrate MvCGE’s superiority.
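The mechanics described in the abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical, simplified rendering, not the paper's implementation: the graph-aware router is approximated by mean-pooling one-hop-aggregated node features into a graph summary; the load equilibrium loss is stood in for by a Switch-Transformer-style balance penalty (fraction of graphs dispatched to each expert times mean routing probability); and the graph discrepancy loss is approximated by a negative symmetric KL divergence between two views' routing distributions, so that minimizing it pushes views toward complementary expert usage. All function names and shapes are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def graph_aware_route(node_feats, adj, router_w, top_k=2):
    """Route one view's graph to experts (hypothetical stand-in for the
    paper's graph-aware router). A graph-level summary is built from
    mean-pooled, one-hop-aggregated node features, so routing reflects
    structure as well as raw features."""
    agg = adj @ node_feats                      # one-hop neighborhood aggregation
    summary = (node_feats + agg).mean(axis=0)   # graph-level summary vector
    logits = summary @ router_w                 # scores over num_experts
    probs = softmax(logits)
    chosen = np.argsort(probs)[-top_k:]         # indices of the top-k experts
    return chosen, probs

def load_balance_loss(all_probs, chosen_lists, num_experts):
    """Switch-style load-equilibrium penalty: dot product of the fraction
    of graphs dispatched to each expert and its mean routing probability.
    Minimized (value 1.0) when expert usage is uniform."""
    frac = np.zeros(num_experts)
    for chosen in chosen_lists:
        frac[chosen] += 1.0 / (len(chosen_lists) * len(chosen))
    mean_prob = np.mean(np.stack(all_probs), axis=0)
    return num_experts * float(frac @ mean_prob)

def graph_discrepancy_loss(p, q, eps=1e-9):
    """Negative symmetric KL between two views' routing distributions;
    an assumed proxy for the paper's distributional-divergence loss."""
    kl = lambda a, b: float(np.sum(a * np.log((a + eps) / (b + eps))))
    return -(kl(p, q) + kl(q, p))
```

In this reading, the shared router weights `router_w` supply the architectural consistency across views, while per-view routing decisions (and the discrepancy term) preserve complementarity.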
Supplementary Material: zip
Primary Area: Deep learning (e.g., architectures, generative models, optimization for deep networks, foundation models, LLMs)
Submission Number: 26683