Unseen Style Transfer Based on a Conditional Fast Style Transfer Network

ICLR 2017
Abstract: In this paper, we propose a feed-forward neural style transfer network that can transfer unseen, arbitrary styles. To do so, we first extend the fast neural style transfer network proposed by Johnson et al. (2016) so that it can learn multiple styles simultaneously by adding a conditional input; we call this "a conditional style transfer network". Next, we add a style condition network that generates a conditional signal directly from a style image, and train "a conditional style transfer network with a style condition network" in an end-to-end manner. The proposed network generates a stylized image from a content image and a style image in a single feed-forward pass.
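To make the described architecture concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' released code): a style condition network encodes a style image into a conditional signal, which then modulates the feature maps of a Johnson-style feed-forward transfer network. The module names, layer sizes, and the FiLM-style scale-and-shift conditioning are illustrative assumptions; the paper's exact conditioning mechanism may differ.

    # Hypothetical sketch of a conditional fast style transfer network with a
    # style condition network; layer sizes and conditioning are assumptions.
    import torch
    import torch.nn as nn

    class StyleConditionNetwork(nn.Module):
        """Encodes a style image into a conditional signal (here, a vector)."""
        def __init__(self, cond_dim=128):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.fc = nn.Linear(64, cond_dim)

        def forward(self, style_image):
            h = self.encoder(style_image).flatten(1)
            return self.fc(h)  # conditional signal derived from the style image

    class ConditionalTransferNetwork(nn.Module):
        """Feed-forward transfer network modulated by the conditional signal."""
        def __init__(self, cond_dim=128, channels=64):
            super().__init__()
            self.encode = nn.Sequential(
                nn.Conv2d(3, channels, 9, padding=4), nn.ReLU(),
            )
            # Assumed conditioning: the signal scales and shifts feature maps.
            self.to_scale = nn.Linear(cond_dim, channels)
            self.to_shift = nn.Linear(cond_dim, channels)
            self.decode = nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
                nn.Conv2d(channels, 3, 9, padding=4),
            )

        def forward(self, content_image, cond):
            feat = self.encode(content_image)
            scale = self.to_scale(cond).unsqueeze(-1).unsqueeze(-1)
            shift = self.to_shift(cond).unsqueeze(-1).unsqueeze(-1)
            return self.decode(feat * scale + shift)

    # Single feed-forward pass: content image + style image -> stylized image.
    style_net = StyleConditionNetwork()
    transfer_net = ConditionalTransferNetwork()
    content = torch.randn(1, 3, 256, 256)
    style = torch.randn(1, 3, 256, 256)
    stylized = transfer_net(content, style_net(style))

In such a setup, both networks would be trained jointly end-to-end against the usual content and style losses, so that at test time an unseen style image can be stylized without any per-style optimization.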
TL;DR: A feed-forward neural style transfer network which can transfer unseen arbitrary styles
Conflicts: uec.ac.jp
