Enhancing Video Stylization with Integral Noise

Published: 01 Jan 2025, Last Modified: 03 Oct 2025 · ICNC 2025 · CC BY-SA 4.0
Abstract: Temporal coherence between video frames remains a persistent challenge in video stylization. Integral noise, a method designed to preserve temporal correlations during diffusion, has been selectively employed within intra-frame correspondence mechanisms to enhance frame-to-frame consistency; however, this application often introduces artifacts such as blurring. This paper investigates the integration of integral noise into video stylization workflows to improve temporal coherence across frames. In addition, incorporating the IP-Adapter enables reference-image-based stylization, broadening the approach's versatility. The combined method produces videos that are temporally coherent, stylistically consistent, and of high quality. Its effectiveness is validated in professional workflows, such as creating background animations for AI-assisted movie production, underscoring its value in advanced video editing and production applications.
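As a rough illustration of the general idea behind temporally correlated diffusion noise, the sketch below warps the previous frame's initial noise along an optical-flow field, blends it with fresh Gaussian noise, and renormalizes it so consecutive frames start denoising from correlated noise maps. This is only a minimal sketch under assumed conventions, not the paper's integral noise algorithm; the helper names `warp_noise` and `correlated_noise`, the `flow` field, and the mixing weight `alpha` are hypothetical placeholders.

```python
# Minimal sketch (NOT the paper's exact integral-noise method): warp the
# previous frame's noise along optical flow, blend with fresh noise, and
# rescale so the result stays roughly unit-variance Gaussian.
import torch
import torch.nn.functional as F

def warp_noise(prev_noise: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Warp a (C, H, W) noise map with a (2, H, W) backward optical flow."""
    _, h, w = prev_noise.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    # Build a normalized sampling grid shifted by the flow field.
    grid_x = (xs + flow[0]) / (w - 1) * 2 - 1
    grid_y = (ys + flow[1]) / (h - 1) * 2 - 1
    grid = torch.stack((grid_x, grid_y), dim=-1).unsqueeze(0)
    warped = F.grid_sample(prev_noise.unsqueeze(0), grid,
                           mode="nearest", align_corners=True)
    return warped.squeeze(0)

def correlated_noise(prev_noise: torch.Tensor, flow: torch.Tensor,
                     alpha: float = 0.8) -> torch.Tensor:
    """Mix warped noise with fresh noise; renormalize toward unit variance."""
    warped = warp_noise(prev_noise, flow)
    fresh = torch.randn_like(prev_noise)
    mixed = alpha * warped + (1 - alpha) * fresh
    return mixed / mixed.std()

# Usage: noise for frame t, correlated with frame t-1 (zero flow as placeholder).
noise_prev = torch.randn(4, 64, 64)   # e.g. latent-space noise for one frame
flow = torch.zeros(2, 64, 64)         # placeholder optical-flow field
noise_t = correlated_noise(noise_prev, flow)
```

In a full pipeline, such correlated noise maps would be passed as the initial latents for each frame's diffusion run, while a reference-image conditioning module such as the IP-Adapter steers the target style.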