Online Adaptive Slimmable Network for Source-free Unsupervised Domain Adaptation

Seungmo Seo, Jongsu Youn, Seungjin Jung, Minji Kwak, Heegon Jin, Hojoon Jung, Jongwon Choi

Published: 01 Jan 2026, Last Modified: 11 Feb 2026 · IEEE Access · CC BY-SA 4.0
Abstract: When the target domain is well defined, a small portion of a deep neural network is often sufficient to achieve satisfactory performance. However, traditional model compression methods typically require access to the original dataset and additional labeled annotations, which is both costly and often restricted by privacy concerns. In this study, we propose a novel slimmable adaptation framework designed for source-free unsupervised domain adaptation (SFUDA). Specifically, we introduce a block transformable network that can be slimmed into multiple sub-models without requiring any additional fine-tuning or retraining. Furthermore, we develop an online multi-path stabilization strategy that enhances the performance of these slimmed sub-models using only unlabeled target data. Comprehensive experiments across various scenarios confirm that the proposed method enables the slimmable model to adapt effectively to diverse domain shifts. Our approach surpasses existing unsupervised domain adaptation and model compression techniques that rely on labeled data.
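To give concrete intuition for the "slimmable" idea the abstract describes, the following minimal NumPy sketch shows a layer whose sub-models share one set of trained weights and are obtained by channel slicing, so no retraining is needed to extract a smaller model. The class name, API, and slicing rule are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class SlimmableLinear:
    """Toy linear layer whose output width can be shrunk at inference
    by slicing the trained weight matrix. Illustrative sketch only;
    the names and slicing scheme are assumptions, not the authors' method."""

    def __init__(self, in_features, out_features, seed=0):
        rng = np.random.default_rng(seed)
        self.weight = rng.standard_normal((out_features, in_features))
        self.bias = np.zeros(out_features)

    def forward(self, x, width_ratio=1.0):
        # Keep only the first k output channels to form a slimmed sub-model;
        # the full model and every sub-model share the same weights.
        k = max(1, int(self.weight.shape[0] * width_ratio))
        return x @ self.weight[:k].T + self.bias[:k]

layer = SlimmableLinear(in_features=8, out_features=16)
x = np.ones((2, 8))
full = layer.forward(x, width_ratio=1.0)   # full model: 16 output channels
slim = layer.forward(x, width_ratio=0.25)  # slimmed sub-model: 4 channels
```

Because the slimmed sub-model is a slice of the full model, its outputs coincide with the first channels of the full forward pass, which is what makes post-hoc slimming possible without fine-tuning.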