Abstract: Few-shot generative model adaptation is a challenging task that aims to adapt a generative model pre-trained on a large-scale source-domain dataset to a target domain with only a few training samples. Existing methods transfer source-domain knowledge to the target generator in various ways; however, over-fitting remains a thorny problem in model adaptation and is hard to avoid given the limited training data. To address this issue, we revisit the training process of model adaptation and design hypothetical experiments. We find that current adaptation methods fail to fully exploit the knowledge learned by the source generator; moreover, in the all-parameter training mode, parameters unrelated to the source domain are also fine-tuned when fitting the few-shot target samples. To circumvent these issues, we propose an optimal kernel modulation method for effective few-shot generative model adaptation. Optimal transport theory is leveraged to measure the importance of model parameters for knowledge preservation and transfer, and parameter optimization is controlled by applying a parameter-efficient kernel modulation according to this importance. Extensive quantitative and qualitative experiments demonstrate the effectiveness and superiority of our method.
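The core idea, importance-gated kernel modulation, can be illustrated with a minimal sketch. The code below is not the paper's actual algorithm: the importance scores are stand-ins for the optimal-transport-based measure described in the abstract, and the single per-kernel scale is a hypothetical, parameter-efficient modulation. Kernels deemed important for source knowledge are modulated less, so adaptation concentrates on less important parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pre-trained conv kernels: (out_channels, in_channels, k, k).
W = rng.standard_normal((8, 4, 3, 3))

# Hypothetical per-kernel importance in [0, 1]; in the paper this role is
# played by an optimal-transport-based measure of how much each parameter
# matters for preserving and transferring source-domain knowledge.
importance = rng.uniform(size=8)

# Parameter-efficient modulation: one learnable scale per output kernel
# (8 parameters) instead of fine-tuning all 8*4*3*3 weights.
gamma = 0.1 * rng.standard_normal(8)

# Important kernels are modulated less (knowledge preservation); less
# important kernels absorb more of the adaptation signal.
scale = 1.0 + (1.0 - importance) * gamma
W_adapted = W * scale[:, None, None, None]

# The modulation changes only a scale per kernel, not the kernel shape.
assert W_adapted.shape == W.shape
```

Only the small modulation parameters (here, `gamma`) would be trained on the few-shot target samples, while the pre-trained kernels `W` stay frozen, which is what makes the scheme parameter-efficient.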
External IDs: dblp:journals/tcsv/ZhangPWJYD25