Abstract: Silhouette-based gait recognition is a popular biometric modality because it is non-intrusive and can capture identity information at a distance and at low resolution. However, cross-view gait recognition remains challenging because appearance changes drastically across viewpoints. Existing gait datasets are sparse in view coverage, which limits how well models generalize to novel views, and current state-of-the-art models are evaluated only on the views present in the training dataset, which hinders practical deployment. In this paper, we propose a three-dimensional (3D) human-model-guided view-morphing framework that generates high-quality gait silhouettes for arbitrary views, improving gait recognition performance on novel views unseen during training. Unlike existing GAN-based or deformation-based methods that rely only on two-dimensional images, the proposed method leverages the body topology of a 3D human model, enabling fine control over the output views and producing plausible deformations even for self-occluded body parts through supervision by a pseudo ground-truth deformation field computed from the 3D model. To enhance the realism of the generated silhouettes, we train the deformation estimation network on clothed silhouettes alongside skin silhouettes rendered from the 3D model (e.g., SMPL), so that the final output accounts for clothing and accessories. We evaluated our approach on the OU-MVLP, CASIA-B, and CASIA-E datasets and observed significant improvements in gait recognition performance, particularly in view-limited scenarios: for example, our method recovers 6.47% in rank-1 recognition rate on OU-MVLP using only three nearby views, and averages of 24.36% and 8.34% on CASIA-B and CASIA-E (normal clothing), respectively, using a single view.
Our framework increases robustness to cross-view variation, enhancing the applicability of gait recognition in real-world settings such as surveillance and security.
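The abstract does not specify how the estimated deformation field is applied to a silhouette, but the core operation it implies, warping one view's silhouette into another view via a per-pixel displacement field, can be illustrated with a minimal sketch. The function name, the toy silhouette, and the constant-shift field below are illustrative assumptions, not the paper's implementation; backward warping with bilinear interpolation is a standard choice for this step.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_silhouette(silhouette, flow):
    """Backward-warp a binary silhouette by a per-pixel deformation field.

    silhouette: (H, W) array in {0, 1}.
    flow: (2, H, W) array of (dy, dx) displacements, interpreted as
          output[y, x] = silhouette[y + dy, x + dx] (backward warping).
    """
    h, w = silhouette.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([ys + flow[0], xs + flow[1]])
    # Bilinear sampling; pixels mapped outside the frame become background.
    warped = map_coordinates(silhouette.astype(float), coords,
                             order=1, mode="constant", cval=0.0)
    return (warped > 0.5).astype(np.uint8)

# Toy example: a 4x4 square silhouette in a 16x16 frame, warped by a
# constant field that samples 3 px to the right, moving the shape left.
sil = np.zeros((16, 16), dtype=np.uint8)
sil[6:10, 8:12] = 1
flow = np.zeros((2, 16, 16))
flow[1] = 3.0
shifted = warp_silhouette(sil, flow)
```

In the paper's setting, `flow` would instead come from the deformation estimation network, trained against the pseudo ground-truth field rendered from the 3D body model.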
External IDs: dblp:journals/access/AljazaerlyXWLY25