Con4m: Unleashing the Power of Consistency and Context in Classification for Blurred-Segmented Time Series

21 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Time series classification, Label consistency learning, Context-aware time series model, Blurred-segmented time series
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Blurred-Segmented Time Series (BST) has emerged as a prevalent form of time series data in many practical applications, posing unique challenges for the Time Series Classification (TSC) task. BST data are segmented into contiguous states whose transitions are inherently blurred. These blurred transitions lead to inconsistent annotations across individuals with differing levels of experience, which hampers both model training and validation. Existing TSC methods, however, typically overlook both this label inconsistency and the contextual dependencies between consecutive classified samples. In this work, we first theoretically characterize what constitutes valuable contextual information. Building on these insights, we incorporate prior knowledge of BST data at both the data and class levels into our model design to capture effective contextual information. Furthermore, we propose a label consistency training framework to harmonize inconsistent labels. Extensive experiments on two public and one private BST dataset validate the effectiveness of our proposed approach, Con4m, for the TSC task on BST data.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3266