Studying Phase Transitions in Contrastive Learning With Physics-Inspired Datasets

Published: 03 Mar 2023, Last Modified: 01 Apr 2023 · Physics4ML Poster · Readers: Everyone
Keywords: Contrastive learning, Phase transitions, Physics, Visualizations, Representation learning, Training dynamics
TL;DR: We provide a detailed observation and analysis of a phase transition in contrastive self-supervised learning using a physics-inspired dataset.
Abstract: In recent years, contrastive learning has become a state-of-the-art technique in representation learning, but the mechanisms by which it learns are not well understood. By focusing on physics-inspired datasets with low intrinsic dimensionality, we are able to visualize and study contrastive training procedures at higher resolution. We empirically study the geometric development of contrastively learned embeddings, discovering phase transitions between locally metastable embedding conformations en route to an optimal structure. Ultimately, we show a strong experimental link between stronger augmentations and reduced training time for contrastively learning more geometrically meaningful representations.
Supplementary Material: zip
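
The abstract does not specify the training objective or augmentation family, so for orientation only, below is a minimal, hypothetical sketch of a standard SimCLR-style NT-Xent contrastive loss in PyTorch, paired with a toy Gaussian-noise augmentation whose scale stands in for "augmentation strength" on a low-dimensional dataset. The function names (`nt_xent_loss`, `augment`), the choice of loss, and the augmentation are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss,
    as used in SimCLR-style contrastive learning.

    z1, z2: (N, d) embeddings of two augmented views of the same N samples.
    """
    n = z1.shape[0]
    # Concatenate both views and project onto the unit sphere.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)       # (2N, d)
    sim = z @ z.T / temperature                               # scaled cosine similarities
    # Mask out self-similarity so a sample can never match itself.
    sim.fill_diagonal_(float("-inf"))
    # The positive for sample i is its other augmented view.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

def augment(x, strength=0.1):
    """Toy augmentation for a low-dimensional dataset: additive Gaussian noise
    whose scale plays the role of 'augmentation strength' (illustrative only)."""
    return x + strength * torch.randn_like(x)

# Usage sketch: `encoder` is any torch.nn.Module mapping inputs to embeddings.
# loss = nt_xent_loss(encoder(augment(x)), encoder(augment(x)))
```

In this reading, varying `strength` is the knob corresponding to the paper's "stronger augmentations", which the abstract links experimentally to faster emergence of geometrically meaningful embedding structure.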