Layer-Varying Deep Reservoir Computing Architecture

ICLR 2025 Conference Submission 13387 Authors

Published: 28 Sept 2024 (modified: 24 Nov 2024), ICLR 2025 Conference Submission, License: CC BY 4.0
Keywords: Multivariate time series, imputation, reservoir computing networks, dynamical systems
Abstract: Data loss and corruption are common incidents that often lead to catastrophic consequences in both theoretical and experimental data analytics. The need to minimize these impacts drives the demand for effective data-analytic tools and imputation methods that replace missing, corrupted, or artifacted data. This paper focuses on multivariate time series imputation, for which we develop a dynamical systems-theoretic deep learning approach. The central idea is to view a multivariate time series as a trajectory of a dynamical system. We then construct a deep reservoir computing architecture that models the temporal evolution of the system from the existing data in the time series. In particular, the architecture is composed of a cascade of echo state network (ESN) layers with diminishing reservoir sizes. We propose a layer-by-layer training scheme, which gives rise to a deep learning-based time series imputation algorithm. We further provide a rigorous convergence analysis of this algorithm by exploiting the echo state property of ESNs, and we demonstrate the imputation performance as well as the efficiency of the training process on both synthetic and real-world datasets arising from diverse applications.
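To illustrate the kind of architecture the abstract describes, the sketch below stacks echo state network layers with diminishing reservoir sizes and trains each layer's linear readout by ridge regression, layer by layer. It is a minimal sketch based only on the abstract: the layer sizes, spectral-radius scaling, ridge penalty, the choice to feed each layer's readout output into the next layer, and the one-step-ahead prediction target are all assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

class ESNLayer:
    """One echo state network layer: fixed random reservoir, trainable linear readout."""
    def __init__(self, n_in, n_res, spectral_radius=0.9, ridge=1e-4):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale the recurrent weights so the spectral radius is below 1
        # (a common sufficient condition aimed at the echo state property).
        self.W = W * (spectral_radius / max(abs(np.linalg.eigvals(W))))
        self.ridge = ridge
        self.W_out = None

    def states(self, U):
        """Run the reservoir over an input sequence U of shape (T, n_in)."""
        x = np.zeros(self.W.shape[0])
        X = np.empty((len(U), self.W.shape[0]))
        for t, u in enumerate(U):
            x = np.tanh(self.W_in @ u + self.W @ x)
            X[t] = x
        return X

    def fit(self, U, Y):
        """Train the readout by ridge regression to map reservoir states to targets Y."""
        X = self.states(U)
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.W_out = np.linalg.solve(A, X.T @ Y)
        return X @ self.W_out  # this layer's output, fed to the next layer

def train_deep_esn(series, sizes=(300, 150, 75)):
    """Layer-by-layer training of a cascade of ESN layers with diminishing reservoir sizes.
    Each layer is trained to predict the next time step; its output drives the next layer
    (one plausible cascade; the exact coupling is not specified in the abstract)."""
    U, Y = series[:-1], series[1:]      # one-step-ahead targets (assumption)
    layers, inp = [], U
    for n_res in sizes:
        layer = ESNLayer(inp.shape[1], n_res)
        inp = layer.fit(inp, Y)
        layers.append(layer)
    return layers

def predict(layers, series):
    out = series
    for layer in layers:
        out = layer.states(out) @ layer.W_out
    return out                           # predictions of series[t+1] from series[t]

# Toy usage: one-step-ahead prediction on a held-out segment of a 2-D sine/cosine series.
t = np.linspace(0, 20 * np.pi, 2000)
data = np.stack([np.sin(t), np.cos(0.7 * t)], axis=1)
model = train_deep_esn(data[:1500])
preds = predict(model, data[1500:-1])
print("one-step RMSE on held-out segment:", np.sqrt(np.mean((preds - data[1501:]) ** 2)))
```

In this toy setup, such a predictor could fill a gap by rolling one-step predictions forward across the missing interval; how the paper actually uses the cascade for imputation, and its convergence analysis, are detailed in the submission itself.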
Supplementary Material: pdf
Primary Area: learning on time series and dynamical systems
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13387