Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach

ICLR 2026 Conference Submission 22168 Authors

20 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: Graph Neural Networks, Graph Filters, Mixture of Experts
Abstract: Graph Neural Networks (GNNs) have proven highly effective for node classification across diverse graph structural patterns. Most GNNs employ a single global filter, typically a low-pass filter for homophilic graphs and a high-pass filter for heterophilic graphs. However, real-world graphs often exhibit a complex mix of homophilic and heterophilic patterns, rendering a single global filter suboptimal. While a few methods introduce multiple global filters, they apply these filters uniformly across all nodes, which may fail to capture the diverse structural patterns present in real-world graphs. In this work, we theoretically demonstrate that a global filter optimized for one pattern can harm performance on nodes exhibiting different patterns. To address this, we introduce Node-MoE, a novel GNN framework that uses a mixture of experts to adaptively select appropriate filters for different nodes. Extensive experiments demonstrate the effectiveness of Node-MoE on both homophilic and heterophilic graphs.
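To make the idea concrete, below is a minimal PyTorch sketch of the general mechanism the abstract describes: a per-node gate mixes the outputs of a low-pass and a high-pass filter expert. This is an illustration only, not the authors' implementation; the two-expert setup, the feature-based gate, and all names (e.g., NodeWiseFilterMoE) are assumptions made for this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def sym_norm_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)


class NodeWiseFilterMoE(nn.Module):
    """Two filter 'experts' (low-pass and high-pass), mixed per node by a gate."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.low = nn.Linear(in_dim, out_dim)   # low-pass expert (smoothing)
        self.high = nn.Linear(in_dim, out_dim)  # high-pass expert (sharpening)
        self.gate = nn.Linear(in_dim, 2)        # per-node weights over the 2 experts

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        a_hat = sym_norm_adj(adj)
        low = self.low(a_hat @ x)        # low-pass: aggregate neighbors
        high = self.high(x - a_hat @ x)  # high-pass: (I - A_hat) x
        w = F.softmax(self.gate(x), dim=-1)  # [N, 2] node-wise mixture weights
        return w[:, :1] * low + w[:, 1:] * high


# Toy usage: 5 nodes with 8-dimensional features on a random undirected graph.
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.T) > 0).float().fill_diagonal_(0)
x = torch.randn(5, 8)
out = NodeWiseFilterMoE(8, 4)(x, adj)
print(out.shape)  # torch.Size([5, 4])
```

In this sketch the gate conditions only on node features; a practical design could also feed it structural signals (e.g., local homophily estimates) so that nodes in homophilic regions lean on the low-pass expert while heterophilic nodes lean on the high-pass one.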
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 22168