Abstract: Occluded person re-identification (Re-ID) is a challenging task, as diverse object-to-person (OTP) and person-to-person (PTP) occlusion scenarios introduce both occlusion interference and loss of target-person features during matching. Most existing methods rely on auxiliary models to identify the unoccluded body parts and eliminate occluded features; such methods are inefficient and cannot handle PTP occlusion or the feature-loss problem. To address these issues, we propose a novel Occlusion-Aware Feature Recover (OAFR) model. OAFR simulates diverse occlusions so that the model learns to perceive both OTP and PTP occlusion, and it recovers occluded query features from the unoccluded features of retrieved gallery images. Concretely, a Prior Knowledge-based Occlusion Simulation method is first introduced to synthesize OTP and PTP occlusions together with the corresponding occlusion labels, endowing the model with target-person perception and occlusion awareness through self-supervised learning. A feature recovery module then reconstructs occluded query features from the corresponding unoccluded local features of the top-$K$ retrieved images via a visibility-weighted average scheme, so that the recovered query retains more comprehensive features for better retrieval. Extensive experiments demonstrate that OAFR outperforms the state of the art on both holistic and occluded Re-ID. On the Occluded-DukeMTMC dataset in particular, OAFR surpasses the state of the art by 6.0% in Rank-1 accuracy and 2.2% in mAP.
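The visibility-weighted average recovery described in the abstract can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's implementation: part features are assumed to be arrays of shape `(P, D)`, visibility scores lie in `[0, 1]`, and the function name `recover_query_features` is hypothetical.

```python
import numpy as np

def recover_query_features(query_feats, query_vis, gallery_feats, gallery_vis,
                           eps=1e-8):
    """Recover occluded query part features via a visibility-weighted
    average of the corresponding parts of the top-K retrieved gallery images.

    query_feats:   (P, D)    part features of the query
    query_vis:     (P,)      visibility score in [0, 1] per query part
    gallery_feats: (K, P, D) part features of the top-K gallery images
    gallery_vis:   (K, P)    visibility score per gallery part
    """
    # Per part, weight each of the K gallery features by its visibility,
    # normalized over the K retrieved images.
    weights = gallery_vis / (gallery_vis.sum(axis=0, keepdims=True) + eps)  # (K, P)
    recovered = (weights[..., None] * gallery_feats).sum(axis=0)            # (P, D)

    # Keep the original feature where the query part is visible; blend in
    # the recovered feature where it is occluded.
    vis = query_vis[:, None]                                                # (P, 1)
    return vis * query_feats + (1.0 - vis) * recovered
```

Fully visible query parts pass through unchanged, while fully occluded parts are replaced by the gallery consensus, which matches the stated goal of retaining comprehensive features for retrieval.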