Unsupervised Meta-Learning via Dynamic Head and Heterogeneous Task Construction for Few-Shot Classification
Meta-learning has been widely applied in recent years to areas such as few-shot learning and reinforcement learning. However, the questions of why and when it outperforms other algorithms in few-shot classification remain to be explored. In this paper, we answer these questions from the perspective of data noise and heterogeneous tasks. Specifically, we conduct pre-experiments by adjusting the proportion of label noise and the degree of task heterogeneity in the dataset. We use Singular Vector Canonical Correlation Analysis (SVCCA) to quantify the representation stability of the neural network and thereby compare the behavior of meta-learning algorithms with that of other algorithms. We find that, benefiting from its bi-level optimization strategy, meta-learning is more robust to label noise and heterogeneous tasks. Based on this conclusion, we argue for a promising future for meta-learning in the unsupervised setting, and thus propose DHM-UHT, a dynamic-head meta-learning algorithm with unsupervised heterogeneous task construction. The core idea of DHM-UHT is to use DBSCAN and a dynamic head to construct heterogeneous tasks and to meta-learn the whole process of unsupervised heterogeneous task construction. On several unsupervised zero-shot and few-shot datasets, DHM-UHT achieves state-of-the-art performance. The code is released at https://github.com/tuantuange/DHM-UHT.