TEI-Face: A Temporal Expression and Identity Stability Oriented Face Swapping

Biying Li, Zhiwei Liu, Jinqiao Wang

Published: 01 Jan 2026, Last Modified: 09 Nov 2025. License: CC BY-SA 4.0
Abstract: Person-agnostic face swapping has gained significant attention in recent years, as it must handle diverse real-world scenarios and generalize across application domains. However, expression and pose information in facial images is difficult to decouple from identity information and is susceptible to interference from the background, so contemporary algorithms struggle to maintain the temporal consistency of expressions and identities. Our analysis of this issue reveals that, in face swapping scenarios, the background of a dynamic video varies far less than the facial region and can therefore be replaced more readily. In this paper, we propose a Temporal Expression and Identity stability oriented Face swapping method (TEI-Face), which reformulates face swapping into two subtasks: motion transfer and background replacement. Using a face reenactment model as the backbone, we design a background correction module that aligns and warps background features and composites them with the driven source face. In addition, a cycle-consistency verification network implements a self-supervised procedure that enforces identity consistency. Experiments on the mainstream FF++ and VFHQ benchmarks demonstrate that TEI-Face achieves state-of-the-art face swapping results in terms of both identity and expression consistency.
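To make the two-subtask decomposition described in the abstract concrete, the following is a minimal PyTorch-style sketch: a reenactment backbone transfers motion, a background correction module warps and composites the target background, and a cycle-consistency identity loss provides self-supervision. All names here (ReenactNet, BGCorrection, cycle_identity_loss, id_encoder, swap_fn) are hypothetical placeholders for illustration, not the authors' actual implementation.

```python
# Hypothetical sketch of TEI-Face's decomposition; not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReenactNet(nn.Module):
    """Stand-in for the face reenactment backbone: drives the source
    face with the target's pose and expression."""
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(6, 3, kernel_size=3, padding=1)

    def forward(self, source, target):
        return torch.sigmoid(self.net(torch.cat([source, target], dim=1)))

class BGCorrection(nn.Module):
    """Stand-in for the background correction module: predicts a dense
    flow to align/warp the target background, plus a blending mask."""
    def __init__(self):
        super().__init__()
        # 3 output channels: 2 for the flow field, 1 for the blend mask.
        self.head = nn.Conv2d(6, 3, kernel_size=3, padding=1)

    def forward(self, swapped_face, target):
        out = self.head(torch.cat([swapped_face, target], dim=1))
        flow, mask = out[:, :2], torch.sigmoid(out[:, 2:3])
        # Build a normalized sampling grid from the flow and warp the background.
        b, _, h, w = target.shape
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij")
        base = torch.stack([xs, ys], dim=-1).expand(b, h, w, 2)
        grid = base + flow.permute(0, 2, 3, 1)
        warped_bg = F.grid_sample(target, grid, align_corners=True)
        # Composite the driven source face over the corrected background.
        return mask * swapped_face + (1 - mask) * warped_bg

def cycle_identity_loss(id_encoder, source, result, swap_fn):
    """Self-supervised cycle check: swapping the result back with the
    source should reproduce the source's identity features."""
    cycled = swap_fn(result, source)
    sim = F.cosine_similarity(
        id_encoder(source).flatten(1), id_encoder(cycled).flatten(1))
    return 1 - sim.mean()
```

In such a setup, a training step would run ReenactNet and BGCorrection in sequence to produce the swapped frame, then add cycle_identity_loss (with a pretrained face-recognition network as id_encoder) to the usual reconstruction objectives to penalize identity drift across frames.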