Visible-Infrared Person Search: A Novel Benchmark and Solution

Published: 2024, Last Modified: 14 May 2025. ICPR (14) 2024. CC BY-SA 4.0
Abstract: Person search aims to simultaneously localize and identify a query person in realistic, uncropped images; it comprises person detection and re-identification (Re-ID). Existing person search methods and datasets predominantly focus on the visible-light domain and struggle to alleviate modality discrepancies. Furthermore, existing visible-infrared person Re-ID methods fail to adequately address occlusions and background interference. To address these issues simultaneously, we first construct a new large-scale dataset, Multi-Modality Person Search (MMPS), which fills the lack of a suitable benchmark for person search in the visible-infrared domain. Encompassing the challenges of complex background interference and occlusion under modality discrepancies, MMPS includes 21,470 images and 1,012 identities captured by six different cameras. We further propose a novel visible-infrared person search method that integrates detection and Re-ID into a progressive process. Specifically, Progressive Inclusion (PI) is proposed to explore backgrounds and provide adaptive proposals. To better handle complex occlusions under significant modality discrepancies, we present Discriminative Mix (DM), which synthesizes more diverse samples by leveraging specific pattern map embedding. This strategy prevents our model from overfitting to specific patterns and enables it to identify diverse and distinctive human parts. Extensive experiments demonstrate that our method (PI-DM) achieves state-of-the-art performance on visible-infrared person search. Our dataset has been released at https://github.com/sysuchx/MMPS.