Abstract: Image-based garment transfer systems aim to transfer desired clothes from a model image onto arbitrary users. However, existing methods do not let users choose which fashion articles to try on, i.e., decide which article (e.g., tops, pants, or both) is swapped. In this paper, we propose an Inpainting-based Virtual Try-On Network (I-VTON) that allows the user to try on arbitrary clothes from the model image in a selective manner. To realize this selectivity, we reformulate virtual try-on as an image inpainting task. First, textures are extracted from the garment and the user respectively to form a coarse result; in this phase, users decide which clothes they wish to try on via an interactive texture control mechanism. Second, the missing regions in the coarse result are recovered by a Texture Inpainting Network (TIN). We introduce a triplet training strategy to ensure the naturalness of the final result. Qualitative and quantitative experiments demonstrate that I-VTON outperforms state-of-the-art methods on both garment details and user identity preservation. Experiments also confirm that our approach can flexibly transfer clothes in a selective manner.
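The following is a minimal PyTorch sketch of the two-stage pipeline summarized above: selected garment textures and the user's own texture are composited into a coarse result, and the remaining holes are filled by an inpainting network. All names here (the mask conventions, `selective_try_on`, and the placeholder `TextureInpaintingNetwork`) are hypothetical illustrations, not the paper's actual implementation.

```python
# Hypothetical sketch of selective try-on via inpainting; the real TIN
# architecture and masks are not specified by the abstract.
import torch
import torch.nn as nn

class TextureInpaintingNetwork(nn.Module):
    """Placeholder TIN: fills the masked-out holes of the coarse composite.
    The paper's actual network is more elaborate."""
    def __init__(self, in_ch: int = 4, out_ch: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, out_ch, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, coarse: torch.Tensor, hole_mask: torch.Tensor) -> torch.Tensor:
        # Condition the generator on where the holes are.
        return self.net(torch.cat([coarse, hole_mask], dim=1))

def selective_try_on(model_img, user_img, garment_masks, user_masks, selection, tin):
    """Compose a coarse result from the selected articles, then inpaint.

    selection: names of the articles the user chose to swap; this stands in
    for the interactive texture control mechanism.
    """
    coarse = user_img.clone()
    hole = torch.zeros_like(user_img[:, :1])  # regions no texture covers
    for article in selection:
        g_mask = garment_masks[article]       # garment texture on the model
        u_mask = user_masks[article]          # user region to be replaced
        coarse = coarse * (1 - u_mask) + model_img * g_mask * u_mask
        hole = torch.clamp(hole + u_mask * (1 - g_mask), 0, 1)
    refined = tin(coarse, hole)
    # Keep known pixels from the coarse result; fill only the holes.
    return coarse * (1 - hole) + refined * hole

# Usage: swap only the top, leaving the pants untouched.
B, H, W = 1, 256, 192
model_img, user_img = torch.rand(B, 3, H, W), torch.rand(B, 3, H, W)
garment_masks = {a: torch.randint(0, 2, (B, 1, H, W)).float() for a in ("top", "pants")}
user_masks = {a: torch.randint(0, 2, (B, 1, H, W)).float() for a in ("top", "pants")}
result = selective_try_on(model_img, user_img, garment_masks, user_masks,
                          ["top"], TextureInpaintingNetwork())
```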