INCREMENTAL LEARNING WITH PRE-TRAINED CONVOLUTIONAL NEURAL NETWORKS AND BINARY ASSOCIATIVE MEMORIES

ICLR 2017 workshop submission (modified: 15 Mar 2017)
Abstract: Thanks to their ability to absorb large amounts of data, Convolutional Neural Networks (CNNs) have become the state of the art in various vision challenges, sometimes even on par with biological vision. CNNs rely on optimisation routines that typically require intensive computational power, so implementing CNNs on embedded architectures is a very active field of research. Of particular interest is the problem of incremental learning, where the device adapts to new observations or classes. To tackle this challenging problem, we propose to combine pre-trained CNNs with Binary Associative Memories, using product random sampling as an intermediate step between the two methods. The resulting architecture requires significantly less computational power and memory than existing counterparts. Moreover, on several challenging vision datasets we show that the proposed architecture is able to perform one-shot learning, even when using only part of the dataset, while maintaining very good accuracy.
TL;DR: Combining associative memories with quantized outputs of a deep CNN enables lightweight one-shot incremental learning.
Conflicts: imt-atlantique.fr
Keywords: Computer vision, Deep learning, Supervised Learning, Transfer Learning
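Below is a minimal sketch of the general idea described in the abstract, assuming NumPy and purely illustrative names and sizes (BinaryAMClassifier, n_parts, n_anchors, etc., are not from the paper): fixed pre-trained CNN features are split by product random sampling into random subvectors, each subvector is quantized to its nearest anchor, and the resulting (part, anchor) symbols are bound to the class label in a Willshaw-style binary associative memory. The paper's exact quantization and memory details may differ.

import numpy as np

class BinaryAMClassifier:
    """Illustrative: product random sampling + Willshaw-style binary
    associative memory on top of fixed, pre-trained CNN features."""

    def __init__(self, feat_dim, n_classes, n_parts=8, part_dim=64,
                 n_anchors=256, seed=0):
        rng = np.random.default_rng(seed)
        # Product random sampling: each part reads a random subset of
        # the CNN feature dimensions (hypothetical sizes).
        self.subsets = [rng.choice(feat_dim, size=part_dim, replace=False)
                        for _ in range(n_parts)]
        # Per-part anchors used for quantization (random here for brevity;
        # in practice they would be derived from data).
        self.anchors = rng.standard_normal((n_parts, n_anchors, part_dim))
        # Binary associative memory: (part, anchor) -> class connections.
        self.memory = np.zeros((n_parts, n_anchors, n_classes), dtype=bool)

    def _quantize(self, feat):
        # One symbol per part: index of the nearest anchor to the subvector.
        return [int(np.argmin(np.linalg.norm(self.anchors[p] - feat[s], axis=1)))
                for p, s in enumerate(self.subsets)]

    def learn(self, feat, label):
        # One-shot, incremental: storing an example only sets binary
        # connections; no gradient step, no retraining of the CNN.
        for p, a in enumerate(self._quantize(feat)):
            self.memory[p, a, label] = True

    def predict(self, feat):
        # Each part votes for every class its active symbol is bound to.
        votes = sum(self.memory[p, a].astype(int)
                    for p, a in enumerate(self._quantize(feat)))
        return int(np.argmax(votes))

# Toy usage with random "CNN features" standing in for real ones.
clf = BinaryAMClassifier(feat_dim=4096, n_classes=10)
x = np.random.default_rng(1).standard_normal(4096)
clf.learn(x, label=3)
assert clf.predict(x) == 3

Because learning only flips bits in the memory, adding a new example or even a new class is a constant-time update that leaves the pre-trained CNN untouched, which is what makes the scheme lightweight and incremental.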