Reserve Output Units for Deep Open-Set Learning
Tegan Maharaj, David Krueger
Feb 12, 2018 (modified: Feb 12, 2018) · ICLR 2018 Workshop Submission · readers: everyone
Abstract: Open-set learning poses a classification problem in which the set of class labels expands over time, a realistic but not widely studied setting.
We propose a deep learning technique for open-set learning based on Reserve Output Units (ROUs), which are designed to help a network anticipate the introduction of new categories during training.
ROUs are additional output units whose representations are trained alongside the units for already-seen classes; a reserve unit can be assigned to a new class once a labeled instance of that class is observed. We experiment with different initialization methods, compare this method with simply adding a new output vector for each novel class, and find that ROUs achieve better and more consistent performance than this add-new baseline.
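The reserve-and-assign mechanism described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a linear softmax classifier, and the class name `ROUClassifier`, the initialization scheme, and the `assign_reserve` method are hypothetical.

```python
import numpy as np

class ROUClassifier:
    """Sketch: a linear classifier with extra (reserve) output units that
    are trained from the start and later bound to newly observed classes."""

    def __init__(self, n_features, n_seen, n_reserve, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        # One weight row per output unit: seen classes first, then reserves.
        # Reserve rows receive gradient updates like any other output unit.
        self.W = rng.normal(0.0, 0.01, size=(n_seen + n_reserve, n_features))
        self.n_seen = n_seen
        self.n_reserve = n_reserve

    def logits(self, x):
        # Logits over all units, seen and reserve alike.
        return self.W @ x

    def assign_reserve(self):
        """Promote one reserve unit to a newly observed class and return
        the index of the output unit now tied to that class."""
        if self.n_reserve == 0:
            raise ValueError("no reserve units left")
        new_index = self.n_seen  # next unit in line is the first reserve
        self.n_seen += 1
        self.n_reserve -= 1
        return new_index

# Start with 3 seen classes and 2 reserves; a novel labeled example arrives.
clf = ROUClassifier(n_features=4, n_seen=3, n_reserve=2)
new_idx = clf.assign_reserve()
```

Because the reserve rows already exist and have been shaped by training, no weights are created at assignment time; assignment only relabels an existing unit, which is the intended contrast with the add-new baseline.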
We further experiment with a loss that encourages the output space to match pretrained word embeddings, with the goal of giving this space good semantics.
In experiments on MNIST and CIFAR, this loss hurt or did not affect performance, but we are optimistic it could be helpful for larger output spaces.
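One plausible form for an embedding-matching loss is a cosine-distance penalty between each class's output-unit weight vector and the corresponding pretrained word embedding. This is a hedged sketch of one way such a loss could look; the function name `embedding_match_loss` and the cosine formulation are assumptions, not the paper's definition.

```python
import numpy as np

def embedding_match_loss(W, E):
    """Mean cosine distance between output-unit weight vectors W
    (n_classes x d) and pretrained word embeddings E (n_classes x d).
    Both the pairing and the cosine form are illustrative assumptions."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    En = E / np.linalg.norm(E, axis=1, keepdims=True)
    # 1 - cosine similarity, averaged over classes; zero when aligned.
    return float(np.mean(1.0 - np.sum(Wn * En, axis=1)))
```

Such a term would be added to the classification loss with a weighting coefficient; perfectly aligned weight vectors incur zero penalty.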
TL;DR: Reserve output units allow deep networks to anticipate the introduction of novel classes
Keywords: Open set learning, deep learning, online learning