Temporal-Aware Test-Time Training via Self-Distillation for One-Shot Image-to-Video Segmentation

28 Sept 2024 (modified: 15 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: medical video analysis, one-shot video object segmentation, test-time training, self-distillation
Abstract: This paper introduces a novel task and approach for one-shot medical video object segmentation using static image datasets. We address the critical challenge of limited annotated video data in medical imaging by proposing a framework that leverages readily available labeled static images to segment objects in medical videos with minimal annotation---specifically, a ground-truth mask for only the first frame. Our method comprises two stages: first, training a one-shot segmentation model exclusively on images; second, adapting it to medical videos through a test-time training strategy. This strategy incorporates a memory mechanism to exploit spatiotemporal context and employs self-distillation to preserve the model's generalization capabilities. To facilitate research in this domain, we present OS-I2V-Seg, a comprehensive dataset comprising 28 image categories and 4 video categories, totaling 68,416 image/frame-mask pairs. Extensive experiments demonstrate the efficacy of our approach in this extremely low-data regime for video object segmentation, establishing baseline performance on OS-I2V-Seg. The code and data will be made publicly available.
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 13118