Abstract: In many incremental learning applications, the need to access previously seen data when updating a model with new data is a common problem; such access is what prevents, for example, neural networks from suffering catastrophic forgetting. In this paper, we focus on incrementing NCMFs, for which classical incrementing strategies such as IGT require access to old data. We propose a new incrementing strategy, named IGTLGSS, that allows this kind of random forest to continue to be incremented without relying on old data. For this purpose, the old data are replaced by synthetic data generated from the pre-trained NCMF that is to be incremented. Experimental studies are performed on UCI benchmarks. The results show that, on the datasets used, NCMFs are able to generate realistic synthetic data. Moreover, the first results obtained from the assessment of our incrementing strategy are encouraging.
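The abstract's core idea is to replace inaccessible old data with synthetic samples drawn from the already-trained model before incrementing it. As a minimal illustrative sketch only (the paper's actual generation procedure from the NCMF is not detailed here), the snippet below shows the general pattern of sampling synthetic data around stored per-class statistics; the function name, Gaussian sampling assumption, and usage values are hypothetical.

```python
import numpy as np

def sample_synthetic_data(class_means, class_stds, n_per_class, rng=None):
    """Draw synthetic samples around stored per-class statistics.

    class_means / class_stds: dicts mapping class label -> 1-D feature vector.
    Gaussian sampling is only an illustration; the paper derives its synthetic
    data from the pre-trained NCMF itself.
    """
    rng = np.random.default_rng(rng)
    X, y = [], []
    for label, mean in class_means.items():
        std = class_stds[label]
        X.append(rng.normal(loc=mean, scale=std, size=(n_per_class, mean.shape[0])))
        y.extend([label] * n_per_class)
    return np.vstack(X), np.array(y)

# Hypothetical usage: synthesize a stand-in for the old data before incrementing.
means = {0: np.array([0.0, 1.0]), 1: np.array([3.0, -2.0])}
stds = {0: np.array([0.5, 0.5]), 1: np.array([1.0, 0.3])}
X_syn, y_syn = sample_synthetic_data(means, stds, n_per_class=100, rng=0)
```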