A Little Selection Goes A Long Way! Parameter Efficient Domain Adaptive Object Detection via Noise-Guided Layer Selection

15 Sept 2025 (modified: 14 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: domain adaptation, object detection
Abstract: Domain Adaptive Object Detection (DAOD) aims to adapt a detector trained on a labeled source domain so that it generalizes well to a target domain with a different data distribution. Existing DAOD methods often fine-tune the entire source model on the target domain, which is parameter-inefficient and limits practical deployment on edge devices. In this paper, we demonstrate that fine-tuning only a subset of layers within the backbone can achieve comparable or even better performance. We propose \textbf{N}oise-\textbf{G}uided \textbf{L}ayer \textbf{S}election (\textbf{NGLS}), a method to identify the backbone layers that best support learning domain-invariant representations. NGLS perturbs an auxiliary dataset with Gaussian noise, measures the cosine similarity of features across layers, and selects the layers whose similarity exceeds a threshold. To demonstrate the effectiveness of our method, we integrate NGLS into two distinct DAOD tasks, Source-Free Object Detection (SFOD) and Unsupervised Domain Adaptive Object Detection (UDAOD). To further validate the generality of our method, we evaluate NGLS with two widely used detectors, Faster R-CNN (FRCNN) and Deformable DETR (DeDETR). The experimental results demonstrate that our method significantly reduces the number of trainable parameters required during adaptation while matching or even surpassing the performance of baseline methods. Specifically, in the Cityscapes to Foggy Cityscapes adaptation, we improve the performance of a DeDETR-based SFOD method by 0.8\% mAP while reducing the model's trainable parameters by 98\%, and we improve the performance of an FRCNN-based SFOD method by 2.1\% mAP while reducing trainable parameters by 93\%.
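The selection step described in the abstract (perturb inputs with Gaussian noise, compare per-layer features of clean vs. noisy inputs via cosine similarity, keep layers above a threshold) can be sketched as follows. This is a minimal, illustrative implementation assuming a `layer_features_fn` callback that returns one feature vector per backbone layer; all names and signatures here are hypothetical, not the paper's actual API.

```python
import math
import random

def cosine_similarity(a, b):
    # Cosine similarity between two flat feature vectors,
    # with a small epsilon to guard against zero norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb + 1e-8)

def noise_guided_layer_selection(layer_features_fn, images,
                                 sigma=0.1, threshold=0.9, seed=0):
    """Sketch of the NGLS idea: select indices of backbone layers whose
    features stay similar when Gaussian noise is added to the input.

    layer_features_fn(x) -> list of per-layer feature vectors (one per layer).
    images -> auxiliary dataset, each item a flat list of floats.
    sigma, threshold, seed -> illustrative hyperparameters (assumptions).
    """
    rng = random.Random(seed)
    num_layers = len(layer_features_fn(images[0]))
    sims = [0.0] * num_layers
    for x in images:
        noisy = [v + rng.gauss(0.0, sigma) for v in x]  # Gaussian perturbation
        feats_clean = layer_features_fn(x)
        feats_noisy = layer_features_fn(noisy)
        for i, (fc, fn) in enumerate(zip(feats_clean, feats_noisy)):
            sims[i] += cosine_similarity(fc, fn)
    sims = [s / len(images) for s in sims]  # average over the auxiliary set
    # Keep only layers whose average similarity exceeds the threshold;
    # these are the layers fine-tuned during adaptation.
    return [i for i, s in enumerate(sims) if s > threshold], sims
```

In a real detector the feature vectors would come from intermediate backbone activations; the design intuition is that layers whose features are stable under input noise are the ones best suited to carry domain-invariant representations, so only those are unfrozen for fine-tuning.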
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 5397