Abstract: Image-based virtual try-on aims to seamlessly fit in-shop clothing onto a person image while preserving the person's pose. Existing methods commonly employ the thin-plate spline (TPS) transformation or appearance flow to deform in-shop clothing so that it aligns with the person's body. Despite their promising performance, these methods often lack precise control over fine details, leading to shape inconsistencies between the clothing and the person's body as well as distortions in exposed limb regions. To tackle these challenges, we propose a novel shape-guided clothing warping method for virtual try-on, dubbed SCW-VTON, which incorporates global shape constraints and additional limb textures to enhance the realism and consistency of the warped clothing and the try-on results. To integrate global shape constraints into clothing warping, we devise a dual-path clothing warping module comprising a shape path and a flow path. The former captures the clothing shape aligned with the person's body, while the latter leverages the mapping between the pre- and post-deformation clothing shapes to guide the estimation of appearance flow. Furthermore, to alleviate distortions in the limb regions of try-on results, we integrate detailed limb guidance by developing a limb reconstruction network based on masked image modeling. With SCW-VTON, we generate try-on results with enhanced clothing shape consistency and precise control over details. Extensive experiments demonstrate the superiority of our approach over state-of-the-art methods both qualitatively and quantitatively.
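The abstract's flow path deforms the in-shop clothing with a dense appearance flow field. As an illustration of that underlying operation (not the authors' implementation; the function name and NumPy-based bilinear sampler are our own minimal sketch), the following shows how a per-pixel flow field warps a clothing image:

```python
import numpy as np

def warp_with_flow(clothing, flow):
    """Warp an image with a dense appearance flow field.

    clothing: (H, W, C) float array, e.g. the in-shop clothing image.
    flow:     (H, W, 2) float array of per-pixel (dy, dx) source offsets,
              i.e. output pixel (y, x) samples input location (y+dy, x+dx).
    Returns the warped image via bilinear sampling (zeros outside bounds).
    """
    H, W, _ = clothing.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float64)
    src_y = ys + flow[..., 0]
    src_x = xs + flow[..., 1]

    # Integer corners and fractional weights for bilinear interpolation.
    y0 = np.floor(src_y).astype(int)
    x0 = np.floor(src_x).astype(int)
    y1, x1 = y0 + 1, x0 + 1
    wy, wx = src_y - y0, src_x - x0

    def sample(y, x):
        # Gather pixels, treating out-of-bounds locations as zero.
        valid = (y >= 0) & (y < H) & (x >= 0) & (x < W)
        out = np.zeros_like(clothing)
        out[valid] = clothing[y[valid], x[valid]]
        return out

    return ((1 - wy)[..., None] * (1 - wx)[..., None] * sample(y0, x0)
            + (1 - wy)[..., None] * wx[..., None] * sample(y0, x1)
            + wy[..., None] * (1 - wx)[..., None] * sample(y1, x0)
            + wy[..., None] * wx[..., None] * sample(y1, x1))
```

In SCW-VTON this flow field is predicted by a network and, per the abstract, its estimation is guided by the shape path's pre-/post-deformation shape mapping rather than estimated from appearance alone.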
Primary Subject Area: [Experience] Multimedia Applications
Secondary Subject Area: [Experience] Art and Culture, [Generation] Generative Multimedia
Relevance To Conference: In recent years, the e-commerce industry has developed rapidly, with a growing number of consumers choosing to purchase clothing online. At the same time, the fashion sector has increasingly captured public interest. In this context, our work provides an image-based virtual try-on method. Given an in-shop clothing image and a person image, our method synthesizes a photo-realistic try-on result that simulates transferring the new clothing onto the person's body. This not only enhances the online shopping experience for consumers but is also applicable to various fashion-related research tasks, such as fashion image retrieval, popularity prediction, and clothing recommendation. Furthermore, by introducing extra global shape constraints, our method is no longer restricted to generating try-on results that follow a specific input clothing shape. This versatility shows that our method extends beyond conventional virtual try-on applications and may spark ideas for novel computer vision tasks in the fashion and clothing domain.
Supplementary Material: zip
Submission Number: 1789