Abstract: Recent progress in 3D display technologies has raised the demand for stylized 3D digital content. Previous approaches either perform style transfer on stereoscopic image pairs or reconstruct a 3D environment from multi-view images. In this paper, we propose a novel view stylization framework that converts a single 2D image into multiple stylized views. The framework is a two-stage pipeline consisting of view synthesis and neural style transfer. We estimate dense optical flow between the source and novel views so that the style transfer model produces consistent results across views. Experimental results show that our method significantly improves cross-view consistency compared to the baseline method.
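To make the flow-guided consistency idea concrete, below is a minimal sketch (not the authors' code) of how dense optical flow can link stylizations of two views: the stylized source view is warped into the novel view with the flow field, and agreement is measured where the flow is valid. The function names (`warp_with_flow`, `view_consistency_error`) and the occlusion mask are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def warp_with_flow(image: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Warp `image` (N,C,H,W) by dense optical `flow` (N,2,H,W), in pixels."""
    n, _, h, w = image.shape
    # Base sampling grid in pixel coordinates.
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=image.dtype),
        torch.arange(w, dtype=image.dtype),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=0).unsqueeze(0).expand(n, -1, -1, -1)
    coords = grid + flow  # follow the flow vectors into the other view
    # Normalize coordinates to [-1, 1], as required by grid_sample.
    coords_x = 2.0 * coords[:, 0] / (w - 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / (h - 1) - 1.0
    sample_grid = torch.stack((coords_x, coords_y), dim=-1)  # (N,H,W,2)
    return F.grid_sample(image, sample_grid, align_corners=True)

def view_consistency_error(stylized_src: torch.Tensor,
                           stylized_novel: torch.Tensor,
                           flow: torch.Tensor,
                           valid_mask: torch.Tensor) -> torch.Tensor:
    """Mean difference between the novel view's stylization and the
    flow-warped source stylization, ignoring occluded/invalid pixels.
    `valid_mask` is (N,1,H,W) with 1 where the flow is reliable."""
    warped = warp_with_flow(stylized_src, flow)
    diff = (stylized_novel - warped).abs() * valid_mask
    return diff.sum() / valid_mask.sum().clamp(min=1)
```

Such an error term could serve either as an evaluation metric for cross-view consistency or as a training loss guiding the style transfer stage; the abstract does not specify which role the flow plays, so this sketch covers only the general mechanism.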