Spectral Graph Wavelets Meet Message Passing: Convergence Rates and Expressive Power of Multi-Resolution GNNs
Keywords: Graph neural networks, spectral graph wavelets, message passing, MPNN, Weisfeiler-Lehman, universal approximation, expressive power, multi-resolution, GCN, GAT
TL;DR: WaveletMPNN uses $O(\log N)$ wavelet scales to approximate graph functions and exceeds 3-WL expressiveness, achieving 3--8\% gains over GCN/GAT/GIN with fewer parameters.
Abstract: We establish a rigorous mathematical framework connecting spectral graph wavelet theory with message-passing neural networks, resolving an open question about the expressive power of multi-resolution graph architectures. We introduce WaveletMPNN, which replaces standard message passing with wavelet-localized aggregation at multiple spectral scales. Our theoretical contributions include: (1) a Universal Approximation Theorem proving WaveletMPNN can approximate any continuous graph function to arbitrary precision with $O(\log N)$ wavelet scales, where $N$ is the number of nodes---exponentially more efficient than the $\Omega(N)$ neighborhood aggregation depth required by standard MPNNs; (2) a separation theorem showing WaveletMPNN strictly exceeds 3-WL in expressive power by leveraging spectral localization; (3) convergence rate analysis showing $O(J^{-s})$ approximation error where $J$ is the number of scales and $s$ is the Sobolev regularity of the target function on the graph. On molecular property prediction (ZINC, OGB-MolHIV), social network classification (IMDB, COLLAB), and point cloud segmentation (ModelNet40), WaveletMPNN achieves 3--8\% improvements over GCN, GAT, GIN, and GPS baselines while using 40\% fewer parameters due to multi-resolution compression.
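The core mechanism described in the abstract, wavelet-localized aggregation at multiple spectral scales, can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the paper's implementation: the band-pass kernel $g(s\lambda) = s\lambda\, e^{-s\lambda}$ and the choice of scales are assumptions, and a full eigendecomposition is used for clarity where a practical system would use a polynomial (e.g. Chebyshev) approximation.

```python
import numpy as np

def wavelet_aggregate(A, X, scales):
    """Aggregate node features X through spectral graph wavelets on
    adjacency matrix A, one wavelet operator per scale (sketch only)."""
    n = A.shape[0]
    d = A.sum(axis=1)
    # Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(n) - d_inv_sqrt @ A @ d_inv_sqrt
    lam, U = np.linalg.eigh(L)  # eigendecomposition (illustrative; costly at scale)
    outputs = []
    for s in scales:
        # Hypothetical band-pass wavelet kernel g(s*lambda) = s*lambda * exp(-s*lambda)
        g = s * lam * np.exp(-s * lam)
        W = U @ np.diag(g) @ U.T  # wavelet operator localized at scale s
        outputs.append(W @ X)
    # Multi-resolution representation: concatenate per-scale aggregations
    return np.concatenate(outputs, axis=1)

# Toy example: 4-node path graph with 2-dimensional node features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 2)
H = wavelet_aggregate(A, X, scales=[0.5, 1.0, 2.0])
```

With $J$ scales the output feature dimension grows by a factor of $J$ (here $3 \times 2 = 6$ columns); the abstract's $O(\log N)$ claim concerns how many such scales suffice for approximation, not how each scale is computed.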
Submission Number: 150