Masking the Gaps: An Imputation-Free Approach to Time Series Modeling with Missing Data

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Time-Series, Deep Learning, Masked Autoencoders, Missing Data, Data Imputation
TL;DR: We propose a novel imputation-free approach to handling missing values in time series that can be trained in an end-to-end manner.
Abstract: Modeling time series is important in a variety of domains, yet it is challenged by the presence of missing values in real-world time-series datasets. Traditional frameworks for modeling time series with missing values typically follow a two-step process: the missing values are first filled in using some imputation technique, and a time-series modeling approach is then applied to the imputed series. However, such two-stage approaches suffer from two major drawbacks: first, imputation errors propagate into the subsequent time-series modeling stage and degrade its performance; and second, there is an inherent trade-off between imputation efficacy and imputation complexity. To this end, we propose a novel imputation-free approach for handling missing values in time series, termed Missing Feature-aware Time Series Modeling (MissTSM), with two main innovations. First, we develop a novel embedding scheme that treats every combination of time-step and feature (or channel) as a distinct token, encoding each into a high-dimensional space. Second, we introduce a novel Missing Feature-Aware Attention (MFAA) Layer that learns latent representations at every time-step based on partially observed features. We evaluate the effectiveness of MissTSM in handling missing values over multiple benchmark datasets using two synthetic masking techniques, masking completely at random (MCAR) and periodic masking, as well as a real-world missing-value dataset.
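The abstract's core mechanism, attention that aggregates only over observed (time-step, feature) tokens, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, weight matrices, and masking strategy (setting key scores to negative infinity before the softmax) are illustrative assumptions.

```python
import numpy as np

def missing_aware_attention(tokens, observed, Wq, Wk, Wv):
    """Self-attention over (time-step, feature) tokens that ignores
    tokens corresponding to missing features as keys/values.

    tokens:   (n, d) token embeddings, one per (time-step, feature) pair
    observed: (n,) boolean mask, True where the feature value was observed
    """
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])
    # Missing tokens receive zero attention weight: send their key
    # scores to -inf so the softmax assigns them probability 0.
    scores[:, ~observed] = -np.inf
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V
```

Because masked tokens contribute nothing as keys or values, the latent representation of each observed token depends only on the partially observed features, with no imputed placeholder values required.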
Primary Area: learning on time series and dynamical systems
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12015