Self-supervised Representation Learning from Random Data Projectors

Published: 28 Oct 2023, Last Modified: 29 Nov 2023 · TRL @ NeurIPS 2023 (Oral)
Keywords: Representation learning, Self-supervised learning, random data projections, domain-agnostic representation learning, tabular representation learning
TL;DR: We propose a new domain-agnostic self-supervised learning framework using random data projections
Abstract: Self-supervised representation learning (SSRL) has advanced considerably by exploiting the transformation-invariance assumption under artificially designed data augmentations. While augmentation-based SSRL algorithms push the boundaries of performance in computer vision and natural language processing, they are often not directly applicable to other data modalities such as tabular and time-series data. This paper presents an SSRL approach that can be applied to these data modalities because it does not rely on augmentations or masking. Specifically, we show that high-quality data representations can be learned by reconstructing random data projections. We evaluate the proposed approach on real-world applications with tabular and time-series data. We show that it outperforms multiple state-of-the-art SSRL baselines and is competitive with methods built on domain-specific knowledge. Due to its wide applicability and strong empirical results, we argue that learning from randomness is a fruitful research direction worthy of attention and further study.
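The core idea in the abstract, learning a representation by reconstructing random projections of the input, can be illustrated with a minimal numpy sketch. All sizes, the frozen random encoder, and the linear reconstruction heads below are hypothetical stand-ins, not the paper's actual architecture or training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy tabular batch: 64 samples, 10 features (hypothetical sizes).
X = rng.normal(size=(64, 10))

# K fixed random projectors of the raw data; their outputs are the
# self-supervised reconstruction targets.
K, proj_dim = 4, 6
projectors = [rng.normal(size=(10, proj_dim)) for _ in range(K)]
targets = [X @ P for P in projectors]

# A frozen random nonlinear map stands in for the trainable encoder.
W_enc = rng.normal(size=(10, 16))
Z = np.tanh(X @ W_enc)  # the learned representation in the real method

def recon_loss(heads):
    # Sum over projectors of the mean squared reconstruction error.
    return sum(np.mean((Z @ H - T) ** 2) for H, T in zip(heads, targets))

# Random (untrained) heads vs. heads fit by least squares: the loss
# drops when the representation retains enough information to
# reconstruct every random projection of the input.
heads_init = [rng.normal(size=(16, proj_dim)) for _ in range(K)]
heads_fit = [np.linalg.lstsq(Z, T, rcond=None)[0] for T in targets]
print(recon_loss(heads_fit) < recon_loss(heads_init))
```

In the full method the encoder would be trained jointly with the reconstruction heads so that this loss shapes the representation itself; here only the heads are fit, purely to show the objective.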
Slides: pdf
Submission Number: 22