- TL;DR: A feed-forward neural style transfer network which can transfer unseen arbitrary styles
- Abstract: In this paper, we propose a feed-forward neural style transfer network which can transfer unseen arbitrary styles. To do so, first, we extend the fast neural style transfer network proposed by Johnson et al. (2016) so that the network can learn multiple styles at the same time by adding a conditional input. We call this "a conditional style transfer network". Next, we add a style condition network which generates a conditional signal directly from a style image, and train "a conditional style transfer network with a style condition network" in an end-to-end manner. The proposed network can instantly generate a stylized image from a content image and a style image in a single feed-forward computation.
- Conflicts: uec.ac.jp
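The two-network pipeline in the abstract (a style condition network that encodes the style image into a conditional signal, and a transfer network that consumes the content image plus that signal) can be sketched at a high level. The sketch below is a minimal illustrative stand-in, not the authors' learned architecture: both "networks" are replaced by simple NumPy statistics, with the conditioning step resembling feature modulation rather than the paper's actual CNNs.

```python
import numpy as np

def style_condition_network(style_image):
    """Hypothetical stand-in for the style condition network: reduce a
    style image (H, W, 3) to a fixed-length conditional signal.
    The real network is a learned CNN; here we use crude channel stats."""
    means = style_image.mean(axis=(0, 1))   # per-channel mean, shape (3,)
    stds = style_image.std(axis=(0, 1))     # per-channel std, shape (3,)
    return np.concatenate([means, stds])    # condition vector, shape (6,)

def conditional_style_transfer(content_image, condition):
    """Hypothetical stand-in for the conditional transfer network:
    modulate normalized content channels with the condition vector
    (loosely analogous to conditional feature normalization)."""
    shift = condition[:3]                   # treat style means as shift
    scale = condition[3:]                   # treat style stds as scale
    normed = (content_image - content_image.mean(axis=(0, 1))) / (
        content_image.std(axis=(0, 1)) + 1e-8)
    return normed * scale + shift

# One-time feed-forward computation: content + style in, stylized image out.
rng = np.random.default_rng(0)
content = rng.random((64, 64, 3))
style = rng.random((64, 64, 3))
cond = style_condition_network(style)
stylized = conditional_style_transfer(content, cond)
print(cond.shape, stylized.shape)
```

Because the conditional signal is computed from the style image itself rather than selected from a fixed learned set, the same forward pass can in principle handle style images never seen during training, which is the key point of the abstract.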