Abstract: Recent advances in Binary Neural Networks (BNNs) are opening up new possibilities for disruptive hardware accelerators. This paper extends prior work on incremental learning to BNNs by proposing a specifically-designed, fully-binarized network and evaluating it on two learning variants, i.e., native and latent replay. The proposed BNN achieves a 53.3% test accuracy on the CIFAR-100 benchmark while relying on binary-only arithmetic, with a 4.1 Mb model size. In a class-incremental learning experimental setup, we evaluate the influence of the replay buffer size on each strategy, highlighting a turning point beyond which latent replay offers better classification performance than native replay. In addition, our approach exhibits robustness over a large number of successive retrainings, with an accuracy always at least 10% higher than that of a full-precision counterpart.