Purified Distillation: Bridging Domain Shift and Category Gap in Incremental Object Detection

Published: 01 Jan 2024 · Last Modified: 13 Nov 2024 · ACM Multimedia 2024 · CC BY-SA 4.0
Abstract: Incremental Object Detection (IOD) simulates the dynamic data streams of real-world applications, requiring detectors to learn new classes or adapt to new domains while retaining knowledge from previous tasks. Most existing IOD methods focus only on class-incremental learning, assuming all data come from the same domain. This assumption rarely holds in practice, as images collected under different conditions often exhibit very different characteristics, such as lighting, weather, and style. Class-incremental IOD methods consequently suffer performance degradation in scenarios with domain shift. To bridge both domain shifts and category gaps in IOD, we propose Purified Distillation (PD), which uses a set of trainable queries to transfer the teacher's attention on old tasks to the student and adopts a gradient reversal layer to guide the student to learn the structure of the teacher's feature space from a micro perspective, an aspect not extensively studied in previous work. Meanwhile, PD combines classification confidence with localization confidence to purify the most meaningful output nodes, so that the student model inherits more comprehensive teacher knowledge. Extensive experiments across various IOD settings on six widely used datasets show that PD significantly outperforms state-of-the-art methods. Even after five steps of incremental learning, our method preserves 60.6% mAP on the first task, while compared methods maintain at most 55.9%.
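The abstract names two mechanisms: a gradient reversal layer for aligning the student with the teacher's feature space, and a purification step that fuses classification and localization confidence to select which output nodes to distill. The sketch below (PyTorch) is not the authors' code; it only illustrates these two generic building blocks. The IoU-like localization score, the multiplicative fusion, and the `keep_ratio` parameter are assumptions for illustration.

```python
import torch


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity forward, negated/scaled gradient backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip and scale the gradient; no gradient for lambd.
        return -ctx.lambd * grad_output, None


def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)


def purify_outputs(cls_scores, loc_scores, keep_ratio=0.3):
    """Select the most reliable predictions as distillation targets.

    cls_scores: (N,) per-prediction classification confidence
    loc_scores: (N,) per-prediction localization confidence (assumed IoU-like)
    Returns indices of the top `keep_ratio` fraction by the fused score.
    Multiplicative fusion is an assumption, not necessarily the paper's rule.
    """
    combined = cls_scores * loc_scores
    k = max(1, int(keep_ratio * combined.numel()))
    return combined.topk(k).indices


# Toy usage: reverse gradients on a feature map, then pick distillation targets.
feat = torch.randn(8, 256, requires_grad=True)
rev = grad_reverse(feat, lambd=0.5)  # downstream losses see flipped gradients here
cls = torch.rand(100)
loc = torch.rand(100)
idx = purify_outputs(cls, loc)       # indices of "purified" output nodes
```

In adversarial-style alignment, the reversal layer makes the feature extractor maximize the loss a discriminator minimizes; here it would push student features toward the teacher's feature-space structure, while the purified indices restrict the distillation loss to outputs that are confident in both classification and localization.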