Keywords: flow matching, self-supervised learning, time series data, short time Fourier transform, neural operator
TL;DR: We propose Flow-Guided Neural Operator (FGNO), a self-supervised framework for time-series data that extracts multi-scale features via varying network layers and noise levels.
Abstract: Self-supervised learning (SSL) is a powerful paradigm for learning from unlabeled time-series data. However, popular methods such as masked autoencoders (MAEs) rely on reconstructing inputs from a fixed, predetermined masking ratio.
Instead of this static design, we propose treating the corruption level as a new degree of freedom for representation learning, enhancing flexibility and performance.
To achieve this, we introduce the Flow-Guided Neural Operator (FGNO), a novel framework combining operator learning with flow matching for SSL training.
By leveraging the Short-Time Fourier Transform (STFT) to enable computation across different time resolutions, our approach learns mappings between function spaces.
We extract a rich hierarchy of features by tapping into different network layers ($l$) and flow times ($s$), which apply varying strengths of noise to the input data.
This enables the extraction of versatile representations, from low-level patterns to high-level semantics, using a single model adaptable to specific tasks.
Unlike prior generative SSL methods that feed noisy inputs at inference time, we propose learning representations under noise but extracting them from clean inputs; this eliminates inference-time randomness and improves accuracy.
We evaluate FGNO across three biomedical domains, where it consistently outperforms established baselines. Our method yields up to 35% AUROC gains in neural signal decoding (BrainTreeBank), 16% RMSE reductions in skin temperature prediction (DREAMT), and over 20% improvement in accuracy and macro-F1 on SleepEDF under low-data regimes. These results highlight FGNO's robustness to data scarcity and its superior capacity to learn expressive representations for diverse time-series applications.
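To make the corruption-level idea in the abstract concrete, below is a minimal sketch (not the authors' released code) of the training- versus inference-time usage it describes: a time series is mapped to an STFT representation, corrupted by interpolating toward Gaussian noise at a flow time $s$ (as in flow matching), and features are read from an intermediate layer $l$; at inference the clean input is fed instead. All module and function names here (TinyEncoder, stft_features, corrupt) are hypothetical illustrations, not the actual FGNO architecture.

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """A small MLP stand-in whose intermediate activations serve as features."""
    def __init__(self, in_dim: int, hidden: int = 64, depth: int = 4):
        super().__init__()
        dims = [in_dim + 1] + [hidden] * depth  # +1 input slot for the flow time s
        self.layers = nn.ModuleList(
            [nn.Sequential(nn.Linear(dims[i], hidden), nn.GELU()) for i in range(depth)]
        )

    def forward(self, x: torch.Tensor, s: float, layer: int) -> torch.Tensor:
        # Condition on flow time s by appending it as an extra feature,
        # then return the activation of layer `layer` (0-indexed).
        s_col = torch.full((x.shape[0], 1), s, dtype=x.dtype, device=x.device)
        h = torch.cat([x, s_col], dim=1)
        for i, blk in enumerate(self.layers):
            h = blk(h)
            if i == layer:
                return h
        return h

def stft_features(x: torch.Tensor, n_fft: int = 64) -> torch.Tensor:
    """Map a raw series (batch, time) to a flattened STFT magnitude (batch, dim)."""
    window = torch.hann_window(n_fft)
    spec = torch.stft(x, n_fft=n_fft, window=window, return_complex=True)
    return spec.abs().flatten(start_dim=1)

def corrupt(x: torch.Tensor, s: float) -> torch.Tensor:
    """Flow-matching style corruption: s=0 gives pure noise, s=1 the clean data."""
    noise = torch.randn_like(x)
    return s * x + (1.0 - s) * noise

# Training-time usage: representations are learned from inputs corrupted at a
# chosen flow time s, so the corruption level acts as a degree of freedom.
x = torch.randn(8, 256)                 # batch of 8 series, 256 samples each
z = stft_features(x)                    # time-frequency ("functional") input
enc = TinyEncoder(in_dim=z.shape[1])
s = 0.7
feat_train = enc(corrupt(z, s), s, layer=2)

# Inference-time usage (as proposed): feed the *clean* input while still choosing
# (layer l, flow time s) per task; no noise is sampled, so the representation
# is deterministic.
feat_eval = enc(z, s, layer=2)
```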
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 22102