RF Prior: Preserving Global-Context Priors for Efficient Instance Segmentation Transfer

17 Sept 2025 (modified: 12 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: adaptive transfer, instance segmentation, receptive field, representation reuse, YOLO
TL;DR: We introduce RF Prior, a receptive-field-aware backbone transfer scheme, together with a multi-scale attentive decoder and automatic bbox-to-polygon generation.
Abstract: We present an efficient transfer-learning framework that reparameterizes a state-of-the-art detector backbone—instantiated with a YOLO-family model—for polygon-based instance segmentation. Our key idea is a Receptive-Field Prior: the largest-receptive-field block (P5) of the backbone, pretrained for detection, is kept fixed to preserve global object context, while intermediate low-level blocks (P3–P4) are fine-tuned for boundary precision. We formalize this with a block-diagonal Gaussian prior on backbone weights, yielding a MAP objective that acts as implicit adaptation. Multi-scale features from P3–P5 are fused in an attentive decoder to predict per-instance polygons. Experiments show strong and stable performance compared with training from scratch or naïve tuning strategies. This approach\footnote{Our framework (code \& dataset) will be released upon acceptance as an Ultralytics-compatible pipeline.} highlights that carefully constrained reuse of high-level detector features—guided by an explicit inductive bias—can yield strong segmentation.
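
One plausible reading of the MAP objective sketched in the abstract (an illustration, not the authors' exact formulation): writing $\theta_b$ for the weights of backbone block $b \in \{P3, P4, P5\}$ and $\theta_b^{\text{pre}}$ for their detection-pretrained values, the block-diagonal Gaussian prior
$$
p(\theta) = \prod_{b \in \{P3,\,P4,\,P5\}} \mathcal{N}\!\left(\theta_b \,\middle|\, \theta_b^{\text{pre}},\, \sigma_b^2 I\right)
\quad\Rightarrow\quad
\hat{\theta} = \arg\min_\theta \; \mathcal{L}_{\text{seg}}(\theta) + \sum_b \frac{1}{2\sigma_b^2}\,\lVert \theta_b - \theta_b^{\text{pre}} \rVert_2^2,
$$
where letting $\sigma_{P5}^2 \to 0$ recovers the frozen P5 block, and finite $\sigma_{P3}^2, \sigma_{P4}^2$ act as an L2 pull of the low-level blocks toward their pretrained values.

Since the code is not yet released, the following is a minimal PyTorch sketch of that recipe, assuming a backbone that exposes submodules named `p3`/`p4`/`p5` (hypothetical names; actual YOLO/Ultralytics stages are labeled differently):

```python
import torch
import torch.nn as nn


def apply_rf_prior(backbone: nn.Module, sigma2: dict[str, float]):
    """Freeze the largest-receptive-field block (P5) and build the
    Gaussian-prior penalty for the fine-tuned blocks (P3, P4)."""
    # Freezing P5 corresponds to the MAP limit sigma_P5^2 -> 0
    # (infinite prior precision on that block).
    for p in backbone.get_submodule("p5").parameters():
        p.requires_grad_(False)

    # Snapshot the pretrained P3/P4 weights: these are the prior means.
    anchors = {
        name: p.detach().clone()
        for name, p in backbone.named_parameters()
        if name.startswith(("p3", "p4"))
    }

    def prior_penalty() -> torch.Tensor:
        # Block-diagonal Gaussian prior => per-block L2 pull
        # toward the pretrained anchor, scaled by 1 / (2 * sigma_b^2).
        total = torch.zeros((), device=next(backbone.parameters()).device)
        for name, p in backbone.named_parameters():
            if name in anchors:
                block = name.split(".")[0]  # 'p3' or 'p4'
                total = total + (p - anchors[name]).pow(2).sum() / (2 * sigma2[block])
        return total

    return prior_penalty


# Usage sketch: add the MAP penalty to the segmentation loss each step.
# penalty = apply_rf_prior(model.backbone, sigma2={"p3": 1e-2, "p4": 1e-1})
# loss = seg_loss + penalty()
```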
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 8674