Incorporating the Cycle Inductive Bias in Masked Autoencoders

Published: 05 Nov 2025, Last Modified: 05 Nov 2025, NLDL 2026 Spotlight, CC BY 4.0
Keywords: Masked autoencoder, self-supervised learning, healthcare, masked learning, time series
TL;DR: A masked autoencoder that uses a cycle inductive bias to reduce computational cost
Abstract: Many time series exhibit cyclic structure, for example in physiological signals such as ECG or EEG, yet most representation learning methods treat them as generic sequences. We propose a masked autoencoder (MAE) framework that explicitly leverages cycles as an inductive bias for more efficient and effective time-series modelling. Our method decomposes sequences into cycles and trains the model to reconstruct masked segments at both the cycle and sequence levels. This cycle-based decomposition shortens the effective sequence length processed by the encoder by up to a factor of ten in our experiments, yielding substantial computational savings without loss in reconstruction quality. At the same time, the approach exposes the encoder to a greater diversity of temporal patterns, as each cycle forms an additional training instance, which enhances the ability to capture subtle intra-cycle variations. Empirically, our framework outperforms three competitive baselines across four cyclic datasets, while also reducing training time on larger datasets.
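
The abstract describes the idea but gives no implementation details; a minimal sketch of the cycle-level decomposition and masking it describes might look like the following. The source of the cycle boundaries, the fixed resample length (`cycle_len=64`), and the mask ratio (`0.75`) are illustrative assumptions, not the authors' choices.

```python
# Illustrative sketch (not the authors' code) of cycle-based decomposition
# for a masked autoencoder: split a long signal at cycle boundaries,
# resample each cycle to a fixed length so cycles stack as "tokens",
# then mask a random subset before encoding.
import numpy as np

def decompose_into_cycles(signal, boundaries, cycle_len=64):
    """Split `signal` at `boundaries` and resample each cycle to `cycle_len`."""
    cycles = []
    for start, end in zip(boundaries[:-1], boundaries[1:]):
        cycle = signal[start:end]
        # Linear resampling so every cycle has the same token length.
        xs = np.linspace(0, len(cycle) - 1, cycle_len)
        cycles.append(np.interp(xs, np.arange(len(cycle)), cycle))
    return np.stack(cycles)  # shape: (num_cycles, cycle_len)

def random_mask(num_cycles, mask_ratio=0.75, rng=None):
    """Return a boolean mask: True = masked (hidden from the encoder)."""
    if rng is None:
        rng = np.random.default_rng()
    num_masked = int(round(mask_ratio * num_cycles))
    mask = np.zeros(num_cycles, dtype=bool)
    mask[rng.choice(num_cycles, num_masked, replace=False)] = True
    return mask

# Toy usage: a synthetic cyclic signal with 10 cycles.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 20 * np.pi, 5000)) + 0.05 * rng.standard_normal(5000)
boundaries = np.arange(0, 5001, 500)  # stand-in for detected cycle starts
cycles = decompose_into_cycles(signal, boundaries)
mask = random_mask(len(cycles), rng=rng)
visible = cycles[~mask]               # only these tokens reach the encoder
print(cycles.shape, visible.shape)    # (10, 64) (2, 64)
```

Under these assumptions, the encoder attends over `num_cycles` tokens rather than the raw samples, so the effective sequence length shrinks by roughly the cycle length, which is consistent with the computational saving the abstract reports.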
Serve As Reviewer: ~Stuart_Gallina_Ottersen1, ~Kerstin_Bach1
Submission Number: 58