Adapting Informative Structures for Cross-Domain Few-Shot Segmentation

23 Sept 2024 (modified: 14 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: cross-domain few-shot segmentation, few-shot segmentation, test-time training
Abstract: Cross-domain few-shot segmentation (CD-FSS) aims to segment objects of novel classes under domain shifts, using only a few mask-annotated support samples. However, directly applying pretrained CD-FSS models to unseen domains is often suboptimal, because parameters fixed after training on source domains cover only limited domain diversity. Moreover, simply adjusting hand-selected model parameters, as in test-time training, typically neglects the distinct domain gaps and characteristics of target domains. To address these issues, we propose adapting informative model structures to target domains by learning domain characteristics from the few labeled support samples during inference. Specifically, we first identify domain-specific model structures adaptively, measuring parameter importance with a novel structure Fisher score in a data-dependent manner. We then progressively train the selected informative structures on hierarchically constructed training samples, moving from fewer to more support shots. Our method selectively and gradually adapts the model to target domains, optimizing adaptation, minimizing the risk of overfitting, and making full use of the limited support data. The resulting Informative Structure Adaptation (ISA) method effectively addresses domain shifts and equips existing few-shot segmentation models with flexible adaptation to new domains, without redesigning or retraining CD-FSS models on base data. Extensive experiments validate the effectiveness of our method, demonstrating superior performance across multiple CD-FSS benchmarks.
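A rough illustration of the structure-selection step described above (not the authors' implementation, whose exact structure Fisher score is not given here): a common empirical-Fisher proxy for parameter-group importance is the mean squared gradient within each group, computed on the support samples, after which the highest-scoring groups are chosen for adaptation. The function names and the toy gradient dictionary below are hypothetical.

```python
import numpy as np

def structure_fisher_scores(grads):
    """Empirical-Fisher proxy: mean squared gradient per parameter group."""
    return {name: float(np.mean(g ** 2)) for name, g in grads.items()}

def select_informative(grads, k):
    """Return the k parameter groups with the largest Fisher scores."""
    scores = structure_fisher_scores(grads)
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy per-group gradients for three hypothetical model blocks; in practice
# these would come from backpropagating the support-set loss.
rng = np.random.default_rng(0)
grads = {
    "block1": rng.normal(0.0, 0.1, 100),  # low gradient energy
    "block2": rng.normal(0.0, 1.0, 100),  # high gradient energy
    "block3": rng.normal(0.0, 0.5, 100),  # medium gradient energy
}
print(select_informative(grads, 2))  # blocks with the largest gradient energy
```

Under this proxy, the selected blocks are simply those whose gradients carry the most energy on the target-domain support data; the progressive few-to-more-shot training schedule would then fine-tune only these blocks.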
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2815