MambaSL: Exploring Single-Layer Mamba for Time Series Classification

Published: 26 Jan 2026, Last Modified: 01 May 2026 · ICLR 2026 Poster · CC BY 4.0
Keywords: time series classification, single-layer Mamba, modular selective SSM, multi-head adaptive pooling, skip connection
TL;DR: We introduce MambaSL, a minimally redesigned single-layer Mamba that achieves state-of-the-art accuracy on the UEA30 benchmark, with reproducible evaluation covering all baselines.
Abstract: Despite recent advances in state space models (SSMs) such as Mamba across various sequence domains, research on their standalone capacity for time series classification (TSC) remains limited. We propose MambaSL, a framework that minimally redesigns the selective SSM and projection layers of a single-layer Mamba, guided by four TSC-specific hypotheses. To address benchmarking limitations—restricted configurations, partial University of East Anglia (UEA) dataset coverage, and insufficiently reproducible setups—we re-evaluate 20 strong baselines across all 30 UEA datasets under a unified protocol. Under this protocol, MambaSL achieves state-of-the-art performance with statistically significant average improvements, and we ensure reproducibility by releasing public checkpoints for all evaluated models. Together with accompanying visualizations, these results demonstrate the potential of Mamba-based architectures as a TSC backbone.
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 11231