Contrastive Representation based Active Learning for Time Series

Published: 01 Jan 2022 · Last Modified: 29 Oct 2024 · DASC/PiCom/CBDCom/CyberSciTech 2022 · License: CC BY-SA 4.0
Abstract: Active learning designs query strategies that select the most representative samples for labeling by an oracle, aiming to maximize model performance while minimizing the labeling workload. We propose REAL, a new pooling-based active learning algorithm for time series data that learns the query strategy and optimizes the representation model in a contrastive manner. To initialize the process, a clustering module selects the first set of samples for labeling. Subsequent samples are selected through a contrastive loss function from three complementary perspectives: self-consistency, attraction to similar samples, and repulsion of disparate samples. The same contrastive loss is used concurrently to update the representation model. We evaluate our method on various time series classification tasks against state-of-the-art algorithms and demonstrate gains or comparable performance for an equal number of labeled samples.
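To make the described pipeline concrete, the sketch below illustrates one plausible form of the loop: cluster-based initialization of the labeled set, followed by query scoring that combines the three perspectives named in the abstract. All function names (`initial_queries`, `query_scores`), the cosine-similarity scoring, and the way the three terms are weighted are illustrative assumptions, not the authors' released implementation or the exact loss defined in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def initial_queries(emb_pool: np.ndarray, n_init: int) -> np.ndarray:
    """Cluster the unlabeled pool and query the sample nearest each centroid.
    (Assumed stand-in for the paper's cluster-based initialization.)"""
    km = KMeans(n_clusters=n_init, n_init=10).fit(emb_pool)
    picks = []
    for c in range(n_init):
        members = np.flatnonzero(km.labels_ == c)
        dists = np.linalg.norm(emb_pool[members] - km.cluster_centers_[c], axis=1)
        picks.append(members[dists.argmin()])
    return np.asarray(picks)

def _cosine(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def query_scores(emb_unlab: np.ndarray, emb_aug: np.ndarray,
                 emb_lab: np.ndarray) -> np.ndarray:
    """Score unlabeled samples from the three perspectives in the abstract:
    self-consistency with an augmented view, attraction to similar labeled
    samples, and repulsion of disparate ones. Equal weighting is a placeholder;
    the paper defines the actual contrastive loss."""
    # Self-consistency: agreement between a sample and its augmented view.
    self_consistency = np.sum(
        emb_unlab / np.linalg.norm(emb_unlab, axis=1, keepdims=True) *
        emb_aug / np.linalg.norm(emb_aug, axis=1, keepdims=True), axis=1)
    sim_to_labeled = _cosine(emb_unlab, emb_lab)
    attraction = sim_to_labeled.max(axis=1)   # closeness to the nearest labeled sample
    repulsion = -sim_to_labeled.min(axis=1)   # distance from the most dissimilar labeled sample
    # Higher score = the current representation is less certain about the sample.
    return -(self_consistency + attraction - repulsion)

# One query round (encoder and augment are hypothetical user-supplied functions):
# scores = query_scores(encoder(X_unlab), encoder(augment(X_unlab)), encoder(X_lab))
# queried = np.argsort(scores)[-k:]
```

In this reading, the same embeddings that drive sample selection would also be updated by the contrastive objective after each labeling round, which is the coupling between query strategy and representation learning that the abstract highlights.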