Keywords: continual learning, domain incremental learning, neural network software repair, batch normalization
TL;DR: We study one-shot domain incremental learning (ODIL), where only one sample from a new domain is given, and adjust batch normalization for ODIL.
Abstract: Domain incremental learning (DIL) has been discussed in previous studies on deep neural network models for classification. In practice, however, we may encounter a situation where we need to perform DIL under the constraint that samples from the new domain are observed only infrequently. In this study, we consider the extreme case where we have only one sample from the new domain, which we call one-shot DIL (ODIL). In simulation experiments on ODIL, we observed that accuracy deteriorated on both the new domain and the original domain even when applying existing DIL methods. We analyzed the reason for this problem through various investigations and found that the cause lies in the statistics of the batch normalization layers. Based on this analysis, we propose a new technique for adjusting these statistics and demonstrate the effectiveness of the proposed method in ODIL through experiments on open datasets.
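The abstract identifies batch normalization statistics as the source of the degradation but does not spell out the proposed adjustment here. As a minimal illustrative sketch (not the paper's actual method), one common way to adapt BN to a new domain is to blend the stored running statistics with statistics computed from the single available new-domain sample; the blending rule and the weight `alpha` below are assumptions for illustration only:

```python
import numpy as np

def blend_bn_stats(running_mean, running_var, x_new, alpha=0.5):
    """Blend stored BN running statistics with statistics computed
    from a single new-domain sample.

    running_mean, running_var: per-feature stats, shape [C]
    x_new: activations of the one new-domain sample, shape [N, C]
           (e.g. spatial positions flattened into N rows)
    alpha: hypothetical interpolation weight (0 = keep original
           stats, 1 = use only the new sample's stats)
    """
    new_mean = x_new.mean(axis=0)
    new_var = x_new.var(axis=0)
    mean = (1.0 - alpha) * running_mean + alpha * new_mean
    var = (1.0 - alpha) * running_var + alpha * new_var
    return mean, var

def bn_inference(x, mean, var, gamma=1.0, beta=0.0, eps=1e-5):
    """Standard BN inference-time normalization with given stats."""
    return gamma * (x - mean) / np.sqrt(var + eps) + beta
```

With `alpha=0` this reduces to ordinary inference on the original domain, so the weight controls a trade-off between preserving old-domain accuracy and adapting to the new domain, which mirrors the tension the abstract describes.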
Submission Number: 5