Where Graph Meets Heterogeneity: Multi-View Collaborative Graph Experts

Published: 18 Sept 2025 · Last Modified: 15 Jan 2026 · NeurIPS 2025 poster · License: CC BY 4.0
Keywords: Graph neural networks, heterogeneous data, multi-view learning
Abstract: The convergence of graph learning and multi-view learning has propelled the emergence of multi-view graph neural networks (MGNNs), which offer strong capabilities for addressing complex real-world data characterized by heterogeneous yet interconnected information. While existing MGNNs exploit the potential of multi-view graphs, an inherent tension persists between the two critical inductive biases of multi-view learning: consistency and complementarity. Consequently, the challenge of defining and resolving this tension in the new context of multi-view graphs remains largely underexplored. To bridge this gap, we propose Multi-view Collaborative Graph Experts (MvCGE), a novel framework grounded in the Mixture-of-Experts (MoE) paradigm. MvCGE establishes architectural consistency through shared parameters while preserving complementarity via layer-wise collaborative graph experts, which are dynamically activated by a graph-aware routing mechanism that adapts to the structural nuances of each view. This dual-level design is further reinforced by two novel components: a load equilibrium loss that prevents expert collapse and ensures balanced specialization, and a graph discrepancy loss based on distributional divergence that enhances inter-view complementarity. Extensive experiments on diverse datasets demonstrate MvCGE's superiority.
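To make the two MoE ingredients named in the abstract concrete, the sketch below illustrates (a) a graph-aware router that scores experts from both node features and one step of neighborhood aggregation, and (b) a Switch-Transformer-style load-balancing term standing in for the load equilibrium loss. This is a minimal numpy illustration of the general technique, not the paper's actual implementation: the aggregation scheme, gate architecture, and loss weighting are all assumptions, and the names (`graph_aware_routing`, `load_equilibrium_loss`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def graph_aware_routing(H, A, W_gate, k=2):
    """Route each node of one view to its top-k experts.

    H: node features (n, d); A: adjacency matrix (n, n) of this view.
    The router sees graph structure through one mean-aggregation step
    (an assumption; the paper's routing mechanism may differ)."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1.0)
    H_agg = (A @ H) / deg                      # structural context per node
    logits = np.concatenate([H, H_agg], axis=1) @ W_gate
    probs = softmax(logits)                    # (n, num_experts) gate probs
    topk = np.argsort(-probs, axis=1)[:, :k]   # experts activated per node
    return probs, topk

def load_equilibrium_loss(probs, topk, num_experts):
    """Balance term in the style of Switch Transformer:
    mean gate probability per expert times the fraction of
    routing assignments it receives, summed and scaled."""
    frac = np.bincount(topk.ravel(), minlength=num_experts) / topk.size
    importance = probs.mean(axis=0)
    return num_experts * float((frac * importance).sum())

# Toy usage: one view with 8 nodes, 4-dim features, 4 experts.
rng = np.random.default_rng(0)
n, d, E = 8, 4, 4
H = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.3).astype(float)
W_gate = rng.normal(size=(2 * d, E))

probs, topk = graph_aware_routing(H, A, W_gate, k=2)
balance = load_equilibrium_loss(probs, topk, E)
```

Because the router conditions on aggregated neighborhoods, structurally different views induce different expert activations, which is the mechanism the abstract credits for preserving complementarity; the balance term discourages all views from collapsing onto a single expert.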
Supplementary Material: zip
Primary Area: Deep learning (e.g., architectures, generative models, optimization for deep networks, foundation models, LLMs)
Submission Number: 26683