Masked Dual-Temporal Autoencoders for Semi-Supervised Time-Series Classification

20 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: semi-supervised learning, time-series classification, self-supervised learning, masked time-series modeling
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a novel masked modeling-based framework for semi-supervised time-series classification that captures intricate temporal patterns at multiple time scales and avoids the need to search for an optimal masking ratio, thereby enhancing classification performance.
Abstract: In this study, we propose a novel framework for semi-supervised time-series classification based on masked time-series modeling, a recent advance in self-supervised learning that is effective for capturing intricate temporal structures within time series. The proposed method extracts intrinsic semantic information from unlabeled instances by reflecting diverse temporal resolutions and considering various masking ratios during model training. We then combine the semantic information extracted from unlabeled time series with supervisory features, including hard-to-learn class information, learned from labeled instances to improve classification performance. Through extensive experiments on semi-supervised time-series classification, we demonstrate the superiority of our approach by achieving state-of-the-art performance.
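For intuition, the snippet below is a minimal NumPy sketch of the masking step described in the abstract: segments of a time series are masked at two temporal resolutions with a randomly drawn masking ratio per view. The segment lengths, the ratio range, and the reconstruction target are illustrative assumptions, not the paper's exact design.

```python
# Minimal sketch (not the authors' implementation) of masked time-series
# modeling at two temporal resolutions with varied masking ratios.
import numpy as np

def mask_series(x, segment_len, mask_ratio, rng):
    """Zero out a random subset of non-overlapping segments of length `segment_len`."""
    x = x.copy()
    n_segments = len(x) // segment_len
    n_masked = int(np.ceil(mask_ratio * n_segments))
    masked_ids = rng.choice(n_segments, size=n_masked, replace=False)
    mask = np.zeros(len(x), dtype=bool)
    for s in masked_ids:
        mask[s * segment_len:(s + 1) * segment_len] = True
    x[mask] = 0.0
    return x, mask

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.standard_normal(256)

# Two temporal resolutions (fine- and coarse-grained segments); instead of
# tuning a single "optimal" masking ratio, draw one per training view.
for segment_len in (4, 16):
    ratio = rng.uniform(0.15, 0.75)  # assumed range, for illustration only
    corrupted, mask = mask_series(series, segment_len, ratio, rng)
    # A masked autoencoder would be trained to reconstruct series[mask]
    # from `corrupted`; here we only report the masked fraction.
    print(f"segment_len={segment_len:2d}  ratio={ratio:.2f}  masked={mask.mean():.2f}")
```

In a full semi-supervised pipeline, representations learned from such masked reconstruction on unlabeled series would be combined with features trained on the labeled subset, as the abstract describes.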
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2242