On the Use of Smooth-L1 Approximation in Echo State Networks for Sparse and Efficient Temporal Modeling

Published: 2025 · Last Modified: 09 Jan 2026 · IDEAL (1) 2025 · CC BY-SA 4.0
Abstract: Echo State Networks, a lightweight approach within the reservoir computing paradigm, offer an efficient solution for processing temporal data under resource constraints. Despite their efficiency, standard architectures depend on dense output layers, which limit interpretability, increase inference cost, and hinder deployment in communication-restricted scenarios. This work is the first to explore a smooth sparsification method for the readout layer based on Smooth-L1 regularization, which enables efficient, unconstrained gradient-based optimization while promoting sparsity. Evaluated on datasets encompassing regression and classification tasks, SmoothL1-ESN achieves substantial readout-weight sparsity while maintaining competitive performance compared to the baseline model. These results confirm the method’s ability to balance sparsity and performance on diverse datasets, making it well-suited for scenarios demanding efficiency and communication reduction, such as distributed or federated learning systems. The code is publicly available at https://github.com/oroboro235/SmoothL1_ESN.
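The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation (see the linked repository for that): a small Echo State Network whose linear readout is trained by plain gradient descent on a squared-error loss plus a Smooth-L1 penalty, here taken as the differentiable surrogate sqrt(w² + ε²) − ε for |w|. All hyperparameters (reservoir size, spectral radius, λ, ε, step count) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Reservoir setup (assumed sizes/scales, not from the paper) ---
n_res, washout = 100, 50
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale to spectral radius 0.9 so the echo state property plausibly holds.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect tanh reservoir states for a 1-D input sequence u."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t:t + 1] + W @ x)
        states.append(x.copy())
    return np.array(states)

# --- Toy task: one-step-ahead prediction of a sine wave ---
T = 500
u = np.sin(0.2 * np.arange(T))
y = np.sin(0.2 * (np.arange(T) + 1))
X = run_reservoir(u)[washout:]   # discard transient states
Y = y[washout:]

# --- Readout training: MSE + lambda * Smooth-L1 penalty ---
lam, eps = 1e-3, 1e-4           # assumed regularization hyperparameters
# Step size from an upper bound on the loss curvature (penalty adds lam/eps).
L = np.linalg.eigvalsh(X.T @ X / len(Y)).max() + lam / eps
lr = 1.0 / L

w = np.zeros(n_res)
for step in range(5000):
    err = X @ w - Y
    # Gradient of sqrt(w^2 + eps^2) - eps is w / sqrt(w^2 + eps^2),
    # a smooth stand-in for sign(w): optimization stays unconstrained.
    grad = X.T @ err / len(Y) + lam * w / np.sqrt(w**2 + eps**2)
    w -= lr * grad

sparsity = np.mean(np.abs(w) < 1e-3)  # fraction of near-zero readout weights
mse = np.mean((X @ w - Y) ** 2)
print(f"readout sparsity: {sparsity:.2f}, train MSE: {mse:.5f}")
```

Because the penalty is smooth, weights are driven toward zero rather than set exactly to zero; in practice a small threshold (as above) or a final pruning step would realize the communication savings the abstract targets.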