Abstract: Long-tail distributions are prevalent in real-world data, posing a challenge to the training of deep learning models. Previous methods predominantly alleviate the long-tail learning problem by reusing tail-class samples. However, since reusing tail-class samples does not increase the diversity of tail-class features, it can lead to overfitting on tail classes and suboptimal performance. To tackle this challenge, we propose a novel method that enhances the feature diversity of tail classes by transferring style information from head classes to tail classes. Our method comprises two key components: a style transfer module and a distribution alignment module. The style transfer module employs the fast Fourier transform to transfer style information from head to tail classes and applies a hierarchical transfer strategy at different levels of the network to maximize the utilization of style features. The distribution alignment module leverages contrastive learning to keep the feature distribution of tail classes consistent before and after style transfer. Our approach is highly flexible and versatile: it serves as a plug-and-play module that can be integrated into long-tail learning models with ResNet and its derivatives as the backbone, such as BBN and GLMC. Experimental results show that our approach significantly enhances model performance, especially on tail classes, on the CIFAR-100/10-LT, ImageNet-LT, and iNaturalist 2018 datasets.
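The abstract does not spell out the exact transfer rule, so the sketch below follows the common Fourier-based style transfer recipe that matches its description: the amplitude spectrum is treated as style and the phase as content, and a head-class image's amplitude is mixed into a tail-class image's spectrum. The function name, the mixing parameter `alpha`, and the full-spectrum (rather than low-frequency-only) mixing are all assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def fourier_style_transfer(tail_img, head_img, alpha=1.0):
    """Hedged sketch of FFT-based style transfer (an assumed variant, not
    the paper's exact module): keep the tail image's phase (content) and
    blend in the head image's amplitude (style) with strength `alpha`."""
    tail_fft = np.fft.fft2(tail_img, axes=(0, 1))
    head_fft = np.fft.fft2(head_img, axes=(0, 1))
    amp_tail, phase_tail = np.abs(tail_fft), np.angle(tail_fft)
    amp_head = np.abs(head_fft)
    # Interpolate amplitudes: style (amplitude) from head, content (phase) from tail.
    amp_mixed = (1 - alpha) * amp_tail + alpha * amp_head
    mixed_fft = amp_mixed * np.exp(1j * phase_tail)
    return np.real(np.fft.ifft2(mixed_fft, axes=(0, 1)))

# Toy usage on random "images"; with alpha=0 the tail image is returned unchanged.
rng = np.random.default_rng(0)
tail = rng.random((32, 32, 3))
head = rng.random((32, 32, 3))
styled = fourier_style_transfer(tail, head, alpha=0.5)
```

The styled image keeps the tail image's spatial structure while inheriting head-class appearance statistics, which is how such a step could enlarge tail-class feature diversity without reusing tail samples.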
External IDs: dblp:journals/cluster/HuCCDH25