INCREMENTAL LEARNING WITH PRE-TRAINED CONVOLUTIONAL NEURAL NETWORKS AND BINARY ASSOCIATIVE MEMORIES
Ghouthi Boukli Hacene, Vincent Gripon, Nicolas Farrugia, Matthieu Arzel, Michel Jezequel
Feb 17, 2017 (modified: Mar 15, 2017) · ICLR 2017 workshop submission · readers: everyone
Abstract: Thanks to their ability to absorb large amounts of data, Convolutional Neural Networks (CNNs) have become the state-of-the-art in various vision challenges, sometimes even on par with biological vision. CNNs rely on optimisation routines that typically require intensive computational power, so implementing CNNs on embedded architectures is a very active field of research. Of particular interest is the problem of incremental learning, where the device adapts to new observations or classes. To tackle this challenging problem, we propose to combine pre-trained CNNs with Binary Associative Memories, using product random sampling as an intermediate between the two methods. The obtained architecture requires significantly less computational power and memory usage than existing counterparts. Moreover, using various challenging vision datasets, we show that the proposed architecture is able to perform one-shot learning – even when using only part of the dataset – while keeping very good accuracy.
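The abstract's pipeline (pre-trained CNN features → product random sampling → binary associative memory) can be sketched roughly as follows. This is a minimal illustration under assumed dimensions, not the authors' implementation: feature vectors are split into subvectors, each subvector is quantized against randomly sampled anchors, and a binary memory stores which (subvector, anchor, class) triples have been observed. Classification is a majority vote over the stored associations; learning a new example only sets a handful of bits, which is what makes the scheme one-shot and lightweight.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 64-d CNN features split into P = 8 subvectors of 8 dims,
# each quantized against K = 16 randomly sampled anchors, C = 5 classes.
D, P, K, C = 64, 8, 16, 5
d = D // P

# Anchors stand in for "product random sampling": per subvector position,
# K reference subvectors (synthetic here; in practice sampled from data).
anchors = rng.normal(size=(P, K, d))

# Binary associative memory: one bit per (subvector position, anchor, class).
memory = np.zeros((P, K, C), dtype=bool)

def quantize(x):
    """Map a feature vector to P anchor indices (nearest anchor per subvector)."""
    parts = x.reshape(P, d)
    dists = np.linalg.norm(anchors - parts[:, None, :], axis=2)  # shape (P, K)
    return dists.argmin(axis=1)

def learn(x, label):
    """One-shot update: set the bits linking each quantized part to the class."""
    memory[np.arange(P), quantize(x), label] = True

def classify(x):
    """Majority vote: return the class matching the most quantized parts."""
    votes = memory[np.arange(P), quantize(x), :].sum(axis=0)  # shape (C,)
    return int(votes.argmax())

# Learn one synthetic example per class, then recognise each of them.
prototypes = [rng.normal(size=D) for _ in range(C)]
for c, x in enumerate(prototypes):
    learn(x, c)
predictions = [classify(x) for x in prototypes]
```

Adding a new class here is just widening `memory` along its last axis and calling `learn` once, with no gradient-based retraining of the CNN — the incremental-learning property the abstract claims.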
TL;DR: Combining associative memories with quantized outputs of a deep CNN enables lightweight one-shot incremental learning.
Keywords: Computer vision, Deep learning, Supervised Learning, Transfer Learning