Abstract: We introduce a semiparametric approach to deep learning. Inspired by complementary learning systems theory in cognitive neuroscience, our approach combines elements of parametric and nonparametric learning by giving a neural network access to a differentiable nearest-neighbor algorithm. Analysis of our model suggests that it is able to leverage the respective advantages of nonparametric and parametric methods. Our model adapts robustly across domains, learns rapidly from limited training sets, and produces well-clustered embeddings, while retaining the expressive power and generalization capabilities characteristic of parametric methods. We demonstrate that our model relies more heavily on nearest-neighbor information early in training but better approximates a purely parametric model as training progresses.
Keywords: deep learning, nonparametric, episodic learning, nearest neighbors, complementary learning systems
TL;DR: Combining parametric and nonparametric methods aids domain transfer and fast learning.
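To make the idea of a "differentiable nearest-neighbor algorithm" concrete, below is a minimal sketch (not the authors' implementation) of one common way to realize it: distances from a query embedding to a memory of stored keys are turned into softmax weights, so the neighbor lookup is smooth and gradients flow into both the parametric encoder and the nonparametric memory. The module name, temperature parameter, label memory, and class count are illustrative assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftNearestNeighbor(nn.Module):
    """Differentiable (soft) nearest-neighbor readout over a stored memory."""

    def __init__(self, num_items: int, dim: int, num_classes: int = 10,
                 temperature: float = 1.0):
        super().__init__()
        # Stored keys and their labels form the nonparametric, episodic component.
        self.keys = nn.Parameter(torch.randn(num_items, dim))
        self.register_buffer("labels", torch.randint(0, num_classes, (num_items,)))
        self.num_classes = num_classes
        self.temperature = temperature

    def forward(self, query: torch.Tensor) -> torch.Tensor:
        # Negative squared Euclidean distances act as similarity logits.
        dists = torch.cdist(query, self.keys) ** 2            # (B, num_items)
        weights = F.softmax(-dists / self.temperature, dim=-1)
        # A weighted vote over neighbor labels gives class probabilities.
        one_hot = F.one_hot(self.labels, self.num_classes).float()
        return weights @ one_hot                               # (B, num_classes)


if __name__ == "__main__":
    encoder = nn.Linear(32, 16)                 # parametric component
    knn_head = SoftNearestNeighbor(num_items=100, dim=16)
    x = torch.randn(8, 32)
    probs = knn_head(encoder(x))                # end-to-end differentiable
    loss = -probs[:, 0].clamp_min(1e-8).log().mean()
    loss.backward()                             # gradients reach encoder and memory
```

In such a setup, the softmax temperature controls how sharply the model attends to its nearest neighbors versus pooling over the whole memory, which is one plausible mechanism for the reported shift from neighbor-driven predictions early in training toward more parametric behavior later on.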