Keywords: Head generation, Artistic control, Texture synthesis
Abstract: Fulfilling a precise artistic vision while creating realistic virtual human characters requires extensive manual effort. This paper proposes a novel approach to streamline this process, generating 3D head geometry and enabling precise control over skin tone as well as fine-grained modification of facial details such as wrinkles. User-specified modifications are conveniently propagated across all assets by our models, effectively reducing the amount of manual intervention needed to achieve a specific artistic vision. This is achieved by our proposed texture-generation pipeline, which leverages correlations between texture and geometry across different head shapes, ethnicities, and genders. Our method allows for accurate skin-tone control while keeping the other appearance factors unchanged. Lastly, we introduce a method for fine-grained control over the details of the generated heads, which enables artists to freely modify one texture map and have the changes cohesively propagated to the other maps. Our experiments show that our method produces diverse and well-behaved geometries, thanks to our GNN-based model, and synthesizes textures that are coherent with the geometry using a CNN-based GAN. We also achieve precise and intuitive skin-tone control through a single control parameter and obtain plausible textures for both facial skin and lips. Our experiments with fine-grained editing on common artist tasks, such as adding wrinkles or removing a beard, showcase how our method simplifies the head-generation workflow by cohesively propagating changes to all texture maps.
Supplementary Material: zip
Camera Ready Version: zip
Submission Number: 3