Visible-Thermal Person Reidentification in Visual Internet of Things With Random Gray Data Augmentation and a New Pooling Mechanism

Abstract: Visible–thermal person reidentification (VT-ReID) is an emerging cross-modality matching problem, which aims to identify the same person across the daytime visible modality and the nighttime thermal modality in the Internet of Things. Existing cutting-edge approaches typically attempt to exploit image generation techniques to synthesize cross-modality images or to design various feature-level constraints that align the feature distributions of heterogeneous data. However, the color variations originating from the different imaging processes of spectrum cameras remain unaddressed, which leads to suboptimal feature representations. In this article, we present a simple but highly effective data augmentation method named Random Gray for the cross-modality matching task. Given a training sample, Random Gray randomly selects a rectangular region and converts it to grayscale. This process generates training images that fuse varying proportions of visible and grayscale information, thereby reducing the risk of overfitting and making the model robust to color variations. In addition, we introduce a novel pooling method called softpooling to retain more information in the reduced activation maps. With the softpooling layer, the network can learn more discriminative person features and further boost its retrieval performance. We conduct extensive experiments on publicly available cross-modality Re-ID data sets (SYSU-MM01 and RegDB) to demonstrate the effectiveness of our proposed method. Experimental results show that the Random Gray and softpooling strategies yield significant accuracy improvements, and they can be adopted as training tricks in further VT-ReID research.
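The abstract describes Random Gray only at a high level (a random rectangular region of the training image is converted to grayscale). Below is a minimal sketch of one plausible implementation; the sampling ranges (`p`, `area_ratio`, `aspect_ratio`) are assumptions borrowed from common erasing-style augmentations and are not specified by the paper.

```python
import random
import numpy as np
from PIL import Image

def random_gray(img, p=0.5, area_ratio=(0.02, 0.4), aspect_ratio=(0.3, 3.3)):
    """With probability p, convert a randomly sampled rectangular patch of a
    PIL RGB image to grayscale, leaving the rest of the image in color.
    Hyperparameter ranges here are illustrative assumptions, not the paper's."""
    if random.random() > p:
        return img

    arr = np.asarray(img).copy()          # H x W x 3, uint8
    h, w = arr.shape[:2]

    for _ in range(10):                   # retry until a valid box fits
        target_area = random.uniform(*area_ratio) * h * w
        ar = random.uniform(*aspect_ratio)
        rh = int(round(np.sqrt(target_area * ar)))
        rw = int(round(np.sqrt(target_area / ar)))
        if 0 < rh < h and 0 < rw < w:
            y = random.randint(0, h - rh)
            x = random.randint(0, w - rw)
            patch = arr[y:y + rh, x:x + rw, :].astype(np.float32)
            # ITU-R BT.601 luma coefficients for RGB -> gray
            gray = 0.299 * patch[..., 0] + 0.587 * patch[..., 1] + 0.114 * patch[..., 2]
            arr[y:y + rh, x:x + rw, :] = gray[..., None].astype(np.uint8)
            break

    return Image.fromarray(arr)
```

The abstract likewise does not define softpooling beyond "retaining more information in the reduced activation maps." A common formulation of this idea is exponentially (softmax-) weighted pooling, where every value in a window contributes in proportion to exp(activation) rather than being discarded as in max pooling. The PyTorch sketch below follows that assumption and should not be read as the authors' exact layer.

```python
import torch
import torch.nn.functional as F

def soft_pool2d(x, kernel_size=2, stride=2):
    """Softmax-weighted downsampling of an activation map of shape (B, C, H, W).
    Each output is sum(exp(x_i) * x_i) / sum(exp(x_i)) over its window, so large
    activations dominate but smaller ones still contribute."""
    # Subtracting the per-map maximum leaves the softmax weights unchanged
    # while preventing overflow in exp().
    x_stable = x - x.amax(dim=(2, 3), keepdim=True)
    weights = torch.exp(x_stable)
    weighted_sum = F.avg_pool2d(x * weights, kernel_size, stride)
    normalizer = F.avg_pool2d(weights, kernel_size, stride)
    return weighted_sum / (normalizer + 1e-8)
```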