Ladder Siamese Network: A Method and Insights for Multi-Level Self-Supervised Learning

Published: 01 Jan 2023 · Last Modified: 29 Sept 2024 · ICIP 2023 · CC BY-SA 4.0
Abstract: In Siamese-network-based self-supervised learning (SSL), multi-level supervision (MLS) is a natural extension that enforces the consistency of intermediate representations under data augmentations. Although existing studies have incorporated MLS to boost system performance in combination with other ideas, vanilla MLS has not been deeply analyzed. Here, we extensively investigate how MLS works and how much impact it has on SSL performance across various training settings, in order to understand the effectiveness of MLS by itself. For this investigation, we develop a simple Siamese-SSL-based MLS framework, the Ladder Siamese Network, equipped with multi-level, non-contrastive, and global/local self-supervised training losses. We show that the proposed framework simultaneously improves BYOL baselines in classification, detection, and segmentation solely by adding MLS. Compared with state-of-the-art methods, our Ladder-based model achieves competitive and balanced performance on all tested benchmarks without large degradation in any of them, suggesting its usefulness for building a multi-purpose backbone.
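The core idea of vanilla MLS can be sketched as a BYOL-style negative-cosine loss applied not only to the final representation but also to intermediate ones, with the per-level losses summed. The sketch below is illustrative only: the function names, the uniform level weights, and the omission of the projector/predictor heads are assumptions, not details from the paper.

```python
import torch
import torch.nn.functional as F

def byol_level_loss(p: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
    """BYOL-style negative cosine similarity for one level.

    p: online-branch prediction; z: target-branch representation
    (stop-gradient applied via detach, as in BYOL).
    """
    p = F.normalize(p, dim=-1)
    z = F.normalize(z.detach(), dim=-1)
    # 2 - 2*cos(p, z); zero when p and z point in the same direction.
    return 2.0 - 2.0 * (p * z).sum(dim=-1).mean()

def multi_level_loss(online_feats, target_feats, weights=None) -> torch.Tensor:
    """Sum per-level losses over all supervised levels (vanilla MLS).

    `weights` lets each level be reweighted; uniform weights are an
    assumption for this sketch.
    """
    if weights is None:
        weights = [1.0] * len(online_feats)
    return sum(w * byol_level_loss(p, z)
               for w, (p, z) in zip(weights, zip(online_feats, target_feats)))

# Hypothetical usage with three intermediate levels of (batch, dim) features:
online = [torch.randn(8, 32) for _ in range(3)]
target = [torch.randn(8, 32) for _ in range(3)]
loss = multi_level_loss(online, target)
```

In an actual two-branch setup, `online_feats` would come from the gradient-updated encoder (plus projector/predictor) and `target_feats` from a momentum-averaged copy, exactly as in BYOL, with MLS simply adding the extra per-level terms to the total loss.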