Keywords: Incremental learning, Domain-aware, EM algorithm
Abstract: Incremental learning is necessary for achieving a human-like intelligent system, since the model must continuously accumulate knowledge in response to real-world streaming data.
In this work, we consider a general yet under-explored incremental learning problem in which both the class distribution and the class-specific domain distribution vary over sequential sessions.
Beyond the challenges discussed extensively in class-incremental learning, this problem also faces an intra-class stability-plasticity dilemma and an intra-class domain-imbalance issue.
To address these issues, we develop a novel domain-aware learning framework.
Concretely, we introduce a flexible class representation based on the von Mises-Fisher mixture model to capture the intra-class structure as well as a bi-level balanced memory to deal with data imbalances within and between classes.
In particular, we build a mixture model on the deep features of each class and devise an expansion-and-reduction strategy that dynamically adjusts the number of components according to the concept's complexity.
Combined with a distillation loss, our design encourages the model to learn a domain-aware representation, which aids in achieving an inter- and intra-class stability-plasticity trade-off.
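To make the vMF-mixture idea concrete, here is a minimal sketch of the EM-style steps for a mixture of von Mises-Fisher components on L2-normalized deep features. The function names, the shared concentration parameter, and the simplification that a shared kappa lets the vMF normalizing constant cancel are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def vmf_responsibilities(feats, mus, kappa, priors):
    # E-step responsibilities for a vMF mixture (illustrative sketch).
    # feats:  (N, D) L2-normalized deep features
    # mus:    (K, D) unit-norm component mean directions
    # kappa:  shared concentration (scalar); with a shared kappa the
    #         vMF normalizing constant is identical across components
    #         and cancels in the softmax below
    # priors: (K,) mixing weights
    logits = kappa * feats @ mus.T + np.log(priors)  # (N, K), log-joint up to a constant
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    resp = np.exp(logits)
    return resp / resp.sum(axis=1, keepdims=True)

def m_step_means(feats, resp):
    # M-step for the mean directions: responsibility-weighted
    # feature sum, projected back onto the unit sphere.
    r = resp.T @ feats                               # (K, D)
    return r / np.linalg.norm(r, axis=1, keepdims=True)
```

An expansion-and-reduction strategy could then monitor how well the current components explain a class's features and split or merge components accordingly; that control logic is omitted here.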
We conduct exhaustive experiments on three benchmarks, each with three representative splits.
The results show that our method consistently outperforms competing methods by a significant margin, demonstrating its effectiveness.
One-sentence Summary: We develop a domain-aware representation based on a mixture model, together with a bi-level balanced strategy, for the rarely explored general incremental learning problem.