Abstract: Garment transfer shows great potential in realistic applications, with the goal of transferring outfits across images of different people. However, garment transfer between images with heavy misalignment or severe occlusion remains a challenge. In this work, we propose the Complementary Transfering Network (CT-Net) to adaptively model different levels of geometric change and transfer outfits between different people. Specifically, CT-Net consists of three modules: i) A complementary warping module first estimates two complementary warpings to transfer the desired clothes at different granularities. ii) A layout prediction module predicts the target layout, which guides the preservation or generation of body parts in the synthesized images. iii) A dynamic fusion module adaptively combines the advantages of the complementary warpings to render the garment transfer results. Extensive experiments conducted on the DeepFashion dataset demonstrate that our network synthesizes high-quality garment transfer images and significantly outperforms state-of-the-art methods both qualitatively and quantitatively. Our source code will be available online.
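The abstract describes the dynamic fusion module only at a high level. As a minimal, hypothetical sketch of the idea, assuming a PyTorch-style implementation (the class `DynamicFusion`, the sub-network `mask_net`, and the mask-based blending are illustrative assumptions, not the paper's actual design), one could predict a per-pixel soft mask from the two complementary warpings and use it to blend them:

```python
import torch
import torch.nn as nn

class DynamicFusion(nn.Module):
    """Hypothetical sketch: adaptively blend two complementary warped
    garment feature maps using a predicted per-pixel soft mask."""

    def __init__(self, channels: int):
        super().__init__()
        # Predict a single-channel fusion weight from the concatenated warps.
        self.mask_net = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
            nn.Sigmoid(),  # mask in [0, 1]
        )

    def forward(self, warp_a: torch.Tensor, warp_b: torch.Tensor) -> torch.Tensor:
        # warp_a, warp_b: (B, C, H, W) features from the two warpings.
        m = self.mask_net(torch.cat([warp_a, warp_b], dim=1))  # (B, 1, H, W)
        # Per-pixel convex combination of the two warped inputs.
        return m * warp_a + (1.0 - m) * warp_b
```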