Arbitrary Scale Texture Synthesis with Feature Map Swapping

Published: 2024 · Last Modified: 12 Apr 2025 · ICIC (7) 2024 · CC BY-SA 4.0
Abstract: Texture synthesis is a technique widely used in computer vision. Existing learning-based methods typically use a fixed network structure and can only generate images that are the same size as, or an integer multiple of, the input sample. In this paper, we propose a swapping-aware texture synthesis method based on feature mapping with a deep generative model. To optimize the loss function, we design a dedicated exchange algorithm that operates directly in feature map space, and texture matching is performed by matching feature maps between the original image and the generated image. The model can generate texture images of arbitrary size from an input sample, and the generated content can be extended outward from the interior of the image, producing an image larger than the original input. Experimental results show that the proposed method preserves more high-frequency detail while maintaining the consistency of the generated content and texture, yielding more realistic synthesized results.
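The abstract's feature-map exchange can be pictured as a patch-wise nearest-neighbor swap in feature space: each patch of the generated image's feature map is replaced by its best-matching patch from the reference (original) image's feature map. The sketch below is illustrative only, assuming cosine similarity as the matching criterion and NumPy arrays standing in for CNN feature maps; the function name, patch size, and matching rule are assumptions, not the paper's actual algorithm.

```python
import numpy as np

def swap_feature_patches(gen_feat, ref_feat, patch=3, stride=1):
    """Replace each patch of gen_feat with its most similar patch
    (by cosine similarity) from ref_feat; overlaps are averaged.

    gen_feat, ref_feat: float arrays of shape (C, H, W), e.g. CNN features.
    """
    C, H, W = gen_feat.shape
    # Collect all reference patches and L2-normalize for cosine matching.
    ref_patches = []
    for i in range(0, ref_feat.shape[1] - patch + 1, stride):
        for j in range(0, ref_feat.shape[2] - patch + 1, stride):
            ref_patches.append(ref_feat[:, i:i + patch, j:j + patch].ravel())
    ref_patches = np.stack(ref_patches)                      # (N, C*patch*patch)
    ref_norm = ref_patches / (np.linalg.norm(ref_patches, axis=1,
                                             keepdims=True) + 1e-8)

    out = np.zeros_like(gen_feat)
    count = np.zeros((H, W))
    for i in range(0, H - patch + 1, stride):
        for j in range(0, W - patch + 1, stride):
            q = gen_feat[:, i:i + patch, j:j + patch].ravel()
            q = q / (np.linalg.norm(q) + 1e-8)
            # Nearest reference patch under cosine similarity.
            best = ref_patches[np.argmax(ref_norm @ q)]
            out[:, i:i + patch, j:j + patch] += best.reshape(C, patch, patch)
            count[i:i + patch, j:j + patch] += 1
    return out / np.maximum(count, 1)                        # average overlaps
```

Because the swap is driven by local patch similarity rather than a fixed output grid, the same matching step applies at any target resolution, which is consistent with the arbitrary-scale claim in the abstract.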