Abstract: Existing deep learning methods for reconstructing 3D garments from 2D images struggle to recover garment surface details. Furthermore, when draping 3D garments onto human bodies, existing methods generate shape and pose displacements based on the target body by predicting latent codes; because of differences in body shape and pose, this process may produce deformation displacements that exceed the garment's constraints and alter its structure. Hence, we propose a Garment Reconstruction and Draping framework (GRD), which fits parameterized garment templates into the human implicit field and restores surface details through normals and a regularization mechanism. We further introduce Structure Preserving Constraints (SPC) into the garment draping process. The SPC strategy measures the Euclidean distance between global and local key features of the deformed garment and the original garment, and trains the model to minimize this distance while generating deformation displacements, effectively constraining changes to the garment structure. Experimental results show that our reconstruction method recovers finer surface detail while preserving the original garment design during draping.
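As a rough illustration of the SPC idea, the sketch below penalizes the Euclidean distance between features of the deformed garment and the original garment, with a pooled "global" term and a per-point "local" term. The feature definitions, pooling choice, and weighting are assumptions for illustration, not the paper's exact implementation.

```python
import torch
import torch.nn as nn


class StructurePreservingLoss(nn.Module):
    """Minimal sketch of an SPC-style loss (assumed form, not the paper's exact definition):
    penalize the Euclidean distance between key features of the deformed and original garments."""

    def __init__(self, global_weight: float = 1.0, local_weight: float = 1.0):
        super().__init__()
        self.global_weight = global_weight
        self.local_weight = local_weight

    def forward(self, deformed_feats: torch.Tensor, original_feats: torch.Tensor) -> torch.Tensor:
        # deformed_feats / original_feats: (batch, num_points, feat_dim)
        # Local term: per-point Euclidean distance between feature vectors.
        local_dist = torch.norm(deformed_feats - original_feats, dim=-1).mean()

        # Global term: Euclidean distance between pooled, garment-level features
        # (mean pooling is an assumption here).
        global_deformed = deformed_feats.mean(dim=1)
        global_original = original_feats.mean(dim=1)
        global_dist = torch.norm(global_deformed - global_original, dim=-1).mean()

        return self.global_weight * global_dist + self.local_weight * local_dist
```

In training, this term would be added to the draping objective so that the displacement generator is encouraged to keep the deformed garment's global and local features close to those of the original garment.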