STTCNeRF: Style Transfer of Neural Radiance Fields for 3D Scenes Based on a Texture Consistency Constraint

Published: 01 Jan 2024 · Last Modified: 13 May 2025 · ICME 2024 · CC BY-SA 4.0
Abstract: Neural radiance fields (NeRF) have been successfully applied to many visual tasks. Because traditional style transfer methods struggle in the 3D domain, NeRF-based style transfer methods have emerged. However, existing methods fail to generate stylized images with clear scene textures and high cross-view consistency. We therefore design STTCNeRF, a novel method for 3D scene style transfer. We propose a texture consistency loss, built on our framework's Merge Net, that captures the source textures and their inherent consistency, yielding clear and consistent stylization. In addition, we propose comprehensive latent style codes as conditional features and remap them through our proposed Style Net to obtain style vectors that conform to the distribution of 2D stylized images. The Merge Net then renders stylized images that are both plausible and natural. Extensive experiments show that STTCNeRF outperforms existing methods in visual perception, image quality, and cross-view consistency.
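The abstract does not specify how the texture consistency loss is computed, so the sketch below is a hypothetical illustration of one common way such a term could be formed: matching texture statistics (Gram matrices of feature maps, e.g., from a VGG-style encoder) between the stylized rendering and the source-scene rendering of the same view. The function names `gram_matrix` and `texture_consistency_loss` are invented for this sketch; the paper's actual Merge-Net-based formulation may differ.

```python
import torch
import torch.nn.functional as F

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    # Texture statistics of a (B, C, H, W) feature map as a
    # channel-correlation (Gram) matrix, normalized by size.
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def texture_consistency_loss(stylized_feats, content_feats):
    # Hypothetical texture consistency term: pull the texture
    # statistics of the stylized rendering toward those of the
    # source rendering at each feature level, so stylization
    # preserves the scene's texture structure across views.
    # (Assumed form; the paper's loss may be defined differently.)
    loss = stylized_feats[0].new_zeros(())
    for fs, fc in zip(stylized_feats, content_feats):
        loss = loss + F.mse_loss(gram_matrix(fs), gram_matrix(fc))
    return loss

# Toy usage: random feature maps stand in for encoder features of a
# stylized and a source rendering of the same camera view.
stylized = [torch.randn(1, 64, 32, 32), torch.randn(1, 128, 16, 16)]
content  = [torch.randn(1, 64, 32, 32), torch.randn(1, 128, 16, 16)]
print(texture_consistency_loss(stylized, content))
```

In a full pipeline, such a term would typically be weighted against a style loss so that stylization strength and texture preservation can be traded off; how STTCNeRF balances these is not stated in the abstract.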