Spatio-Spectral Sequence Processing

Published: 02 Mar 2026, Last Modified: 02 Mar 2026
ICLR 2026 Workshop GRaM Poster
License: CC BY 4.0
Track: tiny paper (up to 4 pages)
Keywords: Sequence Models, Graph Neural Networks, Long-Range Interactions
TL;DR: We introduce Spatio-Spectral Sequence models, demonstrate their benefits on long-range classification benchmarks, and propose an extension for autoregressive modeling
Abstract: Long-sequence data have become ubiquitous over the last decade. Transformers, the de facto standard for sequence processing, suffer from quadratic complexity in both memory and running time, which makes them prohibitively expensive for long sequences. To augment recurrent models with efficient global information exchange, we build on recent developments in the use of spectral information for graph neural networks (GNNs). This paper introduces Spatio-Spectral Sequence (S2Seq) models, which augment arbitrary sequence architectures with a learnable spectral branch that captures global geometric structure. On the Long-Range Arena classification benchmark, we demonstrate that our approach yields meaningful improvements, sometimes bridging the gap to state-of-the-art performance. We also show that truncated and even approximated spectra provide enough information to match the performance of a full FFT computation. Furthermore, we propose a proof-of-concept extension of S2Seq to autoregressive prediction using recurrent window updates in subquadratic time.
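To make the idea of a "learnable spectral branch" concrete, the sketch below shows one plausible way such a block could combine a local recurrent branch with a truncated-FFT branch. This is not the authors' implementation: the class name `S2SeqBlockSketch`, the choice of a GRU for the local branch, the `top_k` truncation scheme, and the fusion layer are all illustrative assumptions.

```python
# Minimal sketch (assumed design, not the paper's method): a sequence block that
# fuses a local "spatial" branch with a learnable truncated-spectrum branch.
import torch
import torch.nn as nn
import torch.fft


class S2SeqBlockSketch(nn.Module):
    def __init__(self, d_model: int, top_k: int = 32):
        super().__init__()
        self.top_k = top_k  # number of low-frequency bins kept (assumed truncation scheme)
        self.local = nn.GRU(d_model, d_model, batch_first=True)  # placeholder spatial/recurrent branch
        # Learnable complex filter applied to the retained frequencies.
        self.freq_filter = nn.Parameter(torch.randn(top_k, d_model, dtype=torch.cfloat) * 0.02)
        self.mix = nn.Linear(2 * d_model, d_model)  # fuse local and global features

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model)
        local_out, _ = self.local(x)

        # Spectral branch: real FFT over the sequence axis, keep only the first top_k bins.
        spec = torch.fft.rfft(x, dim=1)                      # (batch, n_freq, d_model), complex
        k = min(self.top_k, spec.size(1))
        spec_trunc = spec[:, :k] * self.freq_filter[:k]      # learnable per-frequency modulation
        global_out = torch.fft.irfft(spec_trunc, n=x.size(1), dim=1)

        # Concatenate and project back to d_model.
        return self.mix(torch.cat([local_out, global_out], dim=-1))


# Usage sketch
# block = S2SeqBlockSketch(d_model=64, top_k=16)
# y = block(torch.randn(2, 1024, 64))  # -> (2, 1024, 64)
```

Keeping only `top_k` frequency bins mirrors the abstract's observation that truncated spectra can suffice: the global branch then costs O(L log L) for the FFT plus O(k·d) for the filtering, rather than the quadratic cost of full attention.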
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Presenter: ~Nikita_Kostin2
Format: Yes, the presenting author will attend in person if this work is accepted to the workshop.
Funding: No, the presenting author of this submission does *not* fall under ICLR’s funding aims, or has sufficient alternate funding.
Serve As Reviewer: ~Nikita_Kostin2
Submission Number: 46