Graph Adaptive Autoregressive Moving Average Models

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 Spotlight Poster · CC BY 4.0
TL;DR: We introduce GRAMA, an ARMA-based framework that preserves permutation equivariance and adapts its coefficients via selective attention for long-range propagation. Experiments on 26 datasets demonstrate its effectiveness.
Abstract: Graph State Space Models (SSMs) have recently been introduced to enhance Graph Neural Networks (GNNs) in modeling long-range interactions. Despite their success, existing methods either compromise on permutation equivariance or limit their focus to pairwise interactions rather than sequences. Building on the connection between Autoregressive Moving Average (ARMA) models and SSMs, in this paper we introduce GRAMA, a Graph Adaptive method based on a learnable ARMA framework that addresses these limitations. By transforming static graph data into sequential graph data, GRAMA leverages the strengths of the ARMA framework while preserving permutation equivariance. Moreover, GRAMA incorporates a selective attention mechanism that dynamically learns the ARMA coefficients, enabling efficient and flexible long-range information propagation. We also establish theoretical connections between GRAMA and Selective SSMs, providing insight into its ability to capture long-range dependencies. Experiments on 26 synthetic and real-world datasets demonstrate that GRAMA consistently outperforms backbone models and performs competitively with state-of-the-art methods.
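To make the idea concrete, below is a minimal, self-contained sketch of a learnable ARMA(p, q) recurrence over a sequence of graph states. It assumes a simple linear message-passing backbone and a pooled-summary coefficient network as a crude stand-in for the paper's selective attention mechanism; names such as `ARMAGraphBlock` and `coeff_net` are illustrative and do not reflect the authors' implementation.

```python
# Sketch of an ARMA(p, q) recurrence on node features. The backbone and the
# coefficient network below are illustrative assumptions, not GRAMA itself.
import torch
import torch.nn as nn


class ARMAGraphBlock(nn.Module):
    """The next node state is a learned combination of the last p states and
    the last q residuals. Shared scalar coefficients and a message-passing
    backbone keep the update permutation-equivariant."""

    def __init__(self, dim: int, p: int = 2, q: int = 2):
        super().__init__()
        self.p, self.q = p, q
        self.backbone = nn.Linear(dim, dim)  # stand-in for a GNN layer
        # Maps a pooled graph summary to input-dependent ARMA coefficients,
        # a crude stand-in for selective attention.
        self.coeff_net = nn.Linear(dim, p + q)

    def forward(self, states, residuals, adj):
        # states, residuals: lists of [num_nodes, dim] tensors, most recent last
        # adj: [num_nodes, num_nodes] normalized adjacency matrix
        x = states[-1]
        # New residual from one message-passing step on the current state.
        r = torch.tanh(adj @ self.backbone(x))
        # Input-dependent coefficients from a pooled graph summary.
        coeffs = self.coeff_net(x.mean(dim=0))
        a, b = coeffs[: self.p], coeffs[self.p :]
        # ARMA(p, q): combine past states and past residuals.
        x_next = sum(a[i] * states[-(i + 1)] for i in range(min(self.p, len(states))))
        x_next = x_next + sum(b[j] * residuals[-(j + 1)] for j in range(min(self.q, len(residuals))))
        return x_next, r


# Usage: unroll the recurrence to build a sequence of graph states.
num_nodes, dim = 5, 8
adj = torch.eye(num_nodes)  # placeholder for a normalized adjacency matrix
block = ARMAGraphBlock(dim)
states = [torch.randn(num_nodes, dim)]
residuals = [torch.zeros(num_nodes, dim)]
for _ in range(4):
    x_next, r = block(states, residuals, adj)
    states.append(x_next)
    residuals.append(r)
```

Because the same scalar coefficients are applied to every node and the only graph-dependent operation is message passing, relabeling the nodes permutes the output in exactly the same way, which is the permutation-equivariance property the abstract emphasizes.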
Lay Summary: Graphs represent complex systems such as social networks, molecules, or traffic patterns by capturing interactions between nodes. However, many machine learning models, especially graph neural networks (GNNs), struggle to capture long-range interactions, where distant parts of the graph need to influence each other. This limitation, known as oversquashing, occurs when information becomes bottlenecked as it moves across the graph, preventing the model from learning global patterns.

Our research introduces GRAMA, a new method that reimagines how graphs are processed within sequential frameworks such as ARMA and state-space models. Unlike earlier approaches that flatten a graph into a sequence, GRAMA constructs a sequence of graphs, enabling the model to capture long-range interactions while preserving permutation equivariance, a principle ensuring that the output remains consistent regardless of how the graph's nodes are ordered. This is essential for respecting the symmetry and structure of graph data. Inspired by signal-processing methods, GRAMA selectively integrates past and current graph states and residuals, functioning like a memory mechanism that retains and propagates information across the graph.

We also provide a theoretical analysis showing how GRAMA's framework mitigates oversquashing, improving the ability to capture long-range dependencies. In empirical experiments across 26 diverse datasets, GRAMA consistently outperforms standard GNNs while remaining efficient and robust.
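The memory mechanism described above can be written in a generic ARMA(p, q) form; the exact parameterization in GRAMA (in particular, how the coefficients are produced by selective attention) may differ, so this is an illustrative form rather than the paper's equation:

$$\mathbf{X}_{t+1} \;=\; \sum_{i=1}^{p} a_i\, \mathbf{X}_{t+1-i} \;+\; \sum_{j=1}^{q} b_j\, \mathbf{R}_{t+1-j},$$

where $\mathbf{X}_t$ is the node-state matrix at step $t$, each residual $\mathbf{R}_t$ is produced by a GNN backbone, and $a_i, b_j$ are learnable coefficients. Applying the same scalar coefficients to every node is what keeps the update permutation-equivariant.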
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Graph Neural Networks, Auto-regressive Moving Average
Submission Number: 4819