Beyond Aligned Target Face: StyleGAN-Based Face-Swapping via Inverted Identity Learning

Published: 01 Jan 2024, Last Modified: 13 Feb 2025 · ICME Workshops 2024 · CC BY-SA 4.0
Abstract: Recent StyleGAN-based face-swapping methods can transfer identity only to aligned, fixed-size target images, which constrains their applicability in real-world scenarios. In this work, we introduce a pioneering Non-Aligned Target Face Swapping method that transfers identity from an aligned source image to an arbitrary non-aligned target image. Directly reconstructing the identity representation from an aligned source image together with attribute representations from a non-aligned target image often yields generated images whose identity is aligned but whose attributes are not. To overcome this challenge, we propose an inverted identity learning strategy that computes the identity loss after transforming the face-swapping result back into the cropped alignment space. Comprehensive qualitative and quantitative experiments demonstrate the effectiveness of our method compared to current state-of-the-art StyleGAN-based techniques on both aligned and simulated non-aligned data. Furthermore, we validate the generalizability of our method on non-aligned face inputs from diverse out-of-domain datasets, strengthening its potential for real-world applications.
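To make the inverted identity learning idea concrete, the following is a minimal PyTorch sketch of an identity loss computed after warping the swap result back into the cropped alignment space. It is illustrative only: the helper names (inverted_identity_loss, id_encoder, inv_theta) and the use of an inverse affine warp are assumptions, not the authors' released implementation.

import torch
import torch.nn.functional as F

def inverted_identity_loss(swapped_full, inv_theta, id_encoder, src_aligned,
                           aligned_size=(112, 112)):
    """Identity loss in the cropped-alignment space (sketch, assumptions noted above).

    swapped_full : face-swap result in the non-aligned target frame, shape (B, 3, H, W)
    inv_theta    : 2x3 affine matrices (B, 2, 3) mapping the full frame back to the
                   aligned crop, assumed known from the alignment/cropping step
    id_encoder   : a pretrained face-recognition network (e.g., an ArcFace-style model)
    src_aligned  : aligned source crop supplying the target identity
    """
    B = swapped_full.size(0)
    # Warp the generated non-aligned frame back into the aligned crop space so the
    # identity network sees the canonical pose it was trained on.
    grid = F.affine_grid(inv_theta, size=(B, 3, *aligned_size), align_corners=False)
    swapped_aligned = F.grid_sample(swapped_full, grid, align_corners=False)

    # Cosine-similarity identity loss between the source and the re-aligned result.
    emb_src = F.normalize(id_encoder(src_aligned), dim=1)
    emb_swp = F.normalize(id_encoder(swapped_aligned), dim=1)
    return (1.0 - (emb_src * emb_swp).sum(dim=1)).mean()

Under these assumptions, the generator is trained on non-aligned targets while the identity supervision itself is always evaluated in the aligned space, which is the core of the strategy described in the abstract.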