Few-Shot Generative Model Adaptation via Style-Guided Prompt

Published: 01 Jan 2024, Last Modified: 13 Nov 2024. IEEE Transactions on Multimedia, 2024. License: CC BY-SA 4.0.
Abstract: Few-shot generative model adaptation aims to obtain a model that generates high-quality, high-diversity images from only a few training samples. However, such a small training set often causes the model to overfit, and the generated images lose diversity. Existing methods either fail to preserve structural information, which leads to overfitting, or retain too much structure from the source domain, which prevents effective style transfer. To address these problems, we propose an effective generative model adaptation method with a style-guided prompt that balances generative diversity and style transformation. First, by freezing the structure-related parameters of the pre-trained model, we preserve the robustness and diversity of the source-domain generator, which mitigates overfitting and maintains diversity in the generated images. Second, the proposed style-guided prompt captures the target domain's style more naturally, enabling more accurate and efficient style transfer in the generated images. Third, a multi-layer deep contrastive loss is designed to further enhance the diversity and quality of the generated images by preserving the generative diversity of the source domain without using extra target-domain data. Extensive quantitative and qualitative experiments demonstrate the effectiveness and superiority of our method.
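To make the abstract's three ingredients more concrete, the sketch below illustrates, under stated assumptions, (1) freezing an assumed set of structure-related parameters of a pre-trained generator and (2) a multi-layer contrastive loss that keeps the adapted generator's per-layer features aligned with the frozen source generator for the same latent code. This is not the authors' implementation; the toy generator, the choice of which layers count as "structure-related", and the InfoNCE-style loss are illustrative assumptions, and the style-guided prompt module is omitted.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyGenerator(nn.Module):
    """Stand-in generator that exposes its intermediate per-layer features."""

    def __init__(self, z_dim=64, feat_dim=128, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(z_dim if i == 0 else feat_dim, feat_dim) for i in range(n_layers)]
        )

    def forward(self, z):
        feats = []
        h = z
        for layer in self.layers:
            h = torch.relu(layer(h))
            feats.append(h)
        return h, feats  # final output and per-layer features


def freeze_structure_params(model, structure_layer_ids=(0, 1)):
    """Freeze the (assumed) structure-related layers; the remaining layers stay trainable."""
    for i, layer in enumerate(model.layers):
        if i in structure_layer_ids:
            for p in layer.parameters():
                p.requires_grad_(False)


def multilayer_contrastive_loss(adapted_feats, source_feats, tau=0.07):
    """InfoNCE-style loss per layer: each adapted feature should match the source
    feature produced from the same latent code (positive) rather than the features
    of other latents in the batch (negatives)."""
    loss = 0.0
    for fa, fs in zip(adapted_feats, source_feats):
        fa = F.normalize(fa, dim=1)
        fs = F.normalize(fs, dim=1)
        logits = fa @ fs.t() / tau                       # (B, B) similarity matrix
        labels = torch.arange(fa.size(0), device=fa.device)
        loss = loss + F.cross_entropy(logits, labels)
    return loss / len(adapted_feats)


if __name__ == "__main__":
    source_G = ToyGenerator()
    adapted_G = copy.deepcopy(source_G)      # adaptation starts from the source weights
    source_G.requires_grad_(False)           # the source generator stays fixed
    freeze_structure_params(adapted_G)       # only style-related layers are updated

    opt = torch.optim.Adam(
        [p for p in adapted_G.parameters() if p.requires_grad], lr=1e-4
    )

    z = torch.randn(8, 64)
    _, src_feats = source_G(z)
    _, ada_feats = adapted_G(z)
    loss = multilayer_contrastive_loss(ada_feats, src_feats)
    loss.backward()
    opt.step()
    print(f"contrastive loss: {loss.item():.4f}")

In an actual adaptation run this source-alignment term would be combined with the few-shot target-domain objective (e.g., the adversarial and style-transfer losses), so that diversity is inherited from the source model while style is learned from the target samples.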