Abstract: Incremental object detection (IOD) aims to sequentially learn new classes while maintaining the capability to locate and identify old ones. Prior methodologies mainly tackle catastrophic forgetting through knowledge distillation and exemplar replay, ignoring the conflict between limited model capacity and growing knowledge. In this paper, we propose the Dynamic object Query-based DEtection TRansformer (DyQ-DETR), which incrementally expands the model's representation ability to achieve a stability-plasticity tradeoff. First, a new set of learnable object queries is fed into the decoder to represent new classes. Second, we propose isolated bipartite matching for object queries from different phases, based on disentangled self-attention. Thanks to the separate supervision and computation over object queries, we further present risk-balanced partial calibration for effective exemplar replay. Extensive experiments demonstrate that DyQ-DETR significantly surpasses state-of-the-art methods, with limited parameter overhead. The code is available at https://github.com/THUzhangjic/DyQ-DETR.
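One way to picture the disentangled self-attention described in the abstract is a block-diagonal attention mask that stops object queries from one incremental phase attending to queries from another, so each phase's queries can be matched and supervised in isolation. The sketch below is a minimal illustration of that idea under our own assumptions (function name and mask convention are hypothetical), not the authors' implementation.

```python
import numpy as np

def phase_isolated_attn_mask(queries_per_phase):
    """Build a boolean self-attention mask (True = attention allowed)
    in which object queries only attend to queries of the same
    incremental phase, yielding a block-diagonal structure."""
    total = sum(queries_per_phase)
    mask = np.zeros((total, total), dtype=bool)
    start = 0
    for n in queries_per_phase:
        # Queries within one phase may attend to each other;
        # cross-phase attention stays masked out.
        mask[start:start + n, start:start + n] = True
        start += n
    return mask

# Example: 2 queries for old classes, 3 for newly added classes.
mask = phase_isolated_attn_mask([2, 3])
```

Such a mask could be passed to a DETR-style decoder's self-attention layer so that old-class and new-class queries are computed independently, which is what makes separate bipartite matching per phase possible.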
External IDs: dblp:conf/icassp/Zhang0C0W25