Class-Incremental Learning (CIL) is a practical and challenging problem on the path toward general artificial intelligence. Pre-Trained Models (PTMs) have recently led to breakthroughs in both vision and natural language processing (NLP) tasks. Although recent studies suggest that PTMs have some capacity to learn sequentially, a large body of work shows that their catastrophic forgetting still needs to be alleviated. Through a pilot study and a causal analysis of CIL, we find that the root of the problem is an imbalanced effect between new and old data, which causes the classifier to forget old classes. To address this, we propose BaCE, a method that recovers the causal effects from new data to the adaptation of old classes and from old data to the adaptation of new classes. By balancing these causal effects, BaCE encourages new and old data to jointly support the adaptation to every class. We conduct extensive experiments on three tasks (Image Classification, Text Classification, and Named Entity Recognition) with various backbones (ResNet-18, ViT, BERT) in the CIL setting. Empirical results show that the proposed method outperforms a series of CIL baselines across tasks and settings.
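To build intuition for the imbalance the abstract describes, the minimal PyTorch sketch below shows a generic CIL training step in which the loss contributions of new and replayed old data are explicitly reweighted. This is an illustrative simplification, not the BaCE algorithm itself: the function `balanced_cil_step`, the balancing weight `lam`, and the assumption of a replay buffer supplying `old_batch` are all hypothetical.

```python
import torch
import torch.nn.functional as F

def balanced_cil_step(model, optimizer, new_batch, old_batch, lam=0.5):
    """One hypothetical CIL training step that balances the loss
    contributions of new-class data and replayed old-class data.

    Without the second term (or with lam close to 1), gradients are
    dominated by the new classes, which biases the classifier and
    manifests as forgetting of old classes.
    """
    x_new, y_new = new_batch  # samples from the newly introduced classes
    x_old, y_old = old_batch  # samples replayed from previous classes

    optimizer.zero_grad()
    loss_new = F.cross_entropy(model(x_new), y_new)  # effect of new data
    loss_old = F.cross_entropy(model(x_old), y_old)  # effect of old data

    # Reweighting both terms counteracts the new/old data imbalance,
    # letting both data sources contribute to adapting every class.
    loss = lam * loss_new + (1.0 - lam) * loss_old
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this simplified view, setting `lam` to balance the two terms plays the role that balancing the two causal effects plays in BaCE; the paper's actual mechanism operates on causal effects rather than a simple loss reweighting.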