ADAPTIVEMIXGNN: Local Adaptive Inductive Bias for Heterophilic Node Classification

Published: 02 Mar 2026, Last Modified: 11 Mar 2026 · ICLR 2026 Workshop GRaM Poster · CC BY 4.0
Track: tiny paper (up to 4 pages)
Keywords: Graph Neural Networks, Heterophily, Spectral Filtering, Local Adaptivity, Permutation Equivariance, O(∣E∣) Complexity
TL;DR: AdaptiveMixGNN learns node-wise adaptive mixing of low-pass and high-pass filters, achieving state-of-the-art heterophilic node classification with minimalist O(∣E∣) complexity.
Abstract: Most GNNs apply a uniform global filter to every node, implicitly assuming one dominant structural regime. Real graphs violate this assumption: homophilic and heterophilic patterns coexist and vary locally. We introduce AdaptiveMixGNN, a first-order spectral GNN that preserves scalability and simplicity by learning a per-node mixing between low-pass and high-pass shifts: $$ S_{\alpha} = \mathrm{diag}(\alpha)\, S_{\mathrm{LP}} + (I - \mathrm{diag}(\alpha))\, S_{\mathrm{HP}}, $$ with $$ \alpha_i = \sigma(h_i^{\top}\theta + b). $$ This adds only d+1 parameters per layer and keeps O(L · |E|) complexity, comparable to GCNs. On heterophilic benchmarks, AdaptiveMixGNN reaches 79.46% accuracy on Texas and 79.61% on Wisconsin, outperforming the polynomial filters we evaluated (K ≥ 10) while avoiding their overfitting pathologies on small graphs. Ablation studies show that node-wise adaptivity acts as an insurance policy against catastrophic failures of fixed filters, with gains of up to +10.59% over the best static baseline. Finally, a per-node homophily analysis links the learned α values to local label structure (Texas: mean homophily 0.033 versus 0.247 for correct versus incorrect nodes), suggesting that the model discovers a meaningful local frequency response.
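The mixing rule above can be sketched in a few lines of numpy. This is an illustrative sketch, not the authors' implementation: the choice of S_LP as the symmetrically normalized adjacency with self-loops and S_HP = I − S_LP is an assumption (the abstract does not pin these down), and the dense matrix form is used for clarity; a sparse message-passing version recovers the stated O(|E|) cost.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def adaptive_mix_layer(A, H, theta, b):
    """One AdaptiveMixGNN-style propagation step (illustrative sketch).

    A:     (n, n) adjacency matrix
    H:     (n, d) node features
    theta: (d,)   per-layer mixing weights (with b, the d+1 extra parameters)
    b:     scalar bias
    """
    n = A.shape[0]
    # Assumed low-pass shift: symmetric normalization with self-loops.
    A_hat = A + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    S_lp = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # Assumed high-pass shift as the complement of the low-pass one.
    S_hp = np.eye(n) - S_lp

    # Per-node mixing coefficient alpha_i = sigma(h_i^T theta + b) in (0, 1).
    alpha = sigmoid(H @ theta + b)

    # diag(alpha) S_LP H + (I - diag(alpha)) S_HP H, without forming diag(alpha).
    return alpha[:, None] * (S_lp @ H) + (1.0 - alpha)[:, None] * (S_hp @ H)
```

With theta = 0 and b = 0, every node gets alpha = 0.5, so the low- and high-pass branches sum to the identity and the layer returns H/2; nonzero theta lets each node slide between the two regimes.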
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 129