Attentive Recurrent Comparators

Pranav Shyam, Ambedkar Dukkipati

Nov 05, 2016 (modified: Dec 28, 2016) ICLR 2017 conference submission
  • Abstract: Attentive Recurrent Comparators (ARCs) are a novel class of neural networks built with attention and recurrence that learn to estimate the similarity of a set of objects by cycling through them and making observations. The observations made in one object are conditioned on the observations made in all the other objects. This allows ARCs to learn to focus on the salient aspects needed to ascertain similarity. Our simple model, which uses no convolutions, performs comparably to Deep Convolutional Siamese Networks on various visual tasks. However, using ARCs and convolutional feature extractors in conjunction produces a model that is significantly better than any other method and has superior generalization capabilities. On the Omniglot dataset, ARC-based models achieve an error rate of 1.5% on the One-Shot classification task, a 2-3x reduction compared to the previous best models. This is also the first Deep Learning model to outperform humans (4.5%) and surpass the state-of-the-art accuracy set by the highly specialized Hierarchical Bayesian Program Learning (HBPL) system (3.3%). (An illustrative sketch of the observation loop described here follows this list.)
  • TL;DR: Attention and Recurrence can be as good as Convolution in some cases; combining all three yields even bigger returns.
  • Conflicts: csa.iisc.ernet.in
  • Keywords: Deep learning, Computer vision
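
The following is a minimal sketch of the observation loop the abstract describes: a recurrent controller alternates attention glimpses between the two objects being compared, so each glimpse is conditioned, through the shared hidden state, on everything observed in the other object so far. All names (AttentiveRecurrentComparator, glimpse), sizes, and the affine-grid crop attention are assumptions made for illustration; the paper's actual attention mechanism and architecture may differ.

    # Illustrative sketch only; not the authors' implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentiveRecurrentComparator(nn.Module):
        def __init__(self, glimpse_size=8, hidden_size=128, num_glimpses=8):
            super().__init__()
            self.glimpse_size = glimpse_size
            self.num_glimpses = num_glimpses
            # Hidden state -> attention parameters (x, y, zoom), each in [-1, 1].
            self.attend = nn.Linear(hidden_size, 3)
            self.rnn = nn.LSTMCell(glimpse_size * glimpse_size, hidden_size)
            self.score = nn.Linear(hidden_size, 1)

        def glimpse(self, image, h):
            # Differentiable crop via an affine grid; a stand-in for whatever
            # attention mechanism the paper actually uses.
            params = torch.tanh(self.attend(h))
            x, y, zoom = params[:, 0], params[:, 1], params[:, 2]
            scale = 0.75 + 0.25 * zoom  # zoom in [-1, 1] -> scale in [0.5, 1.0]
            theta = torch.zeros(image.size(0), 2, 3, device=image.device)
            theta[:, 0, 0] = scale
            theta[:, 1, 1] = scale
            theta[:, 0, 2] = x
            theta[:, 1, 2] = y
            grid = F.affine_grid(
                theta,
                (image.size(0), 1, self.glimpse_size, self.glimpse_size),
                align_corners=False,
            )
            patch = F.grid_sample(image, grid, align_corners=False)
            return patch.flatten(1)

        def forward(self, img_a, img_b):
            batch = img_a.size(0)
            h = img_a.new_zeros(batch, self.rnn.hidden_size)
            c = img_a.new_zeros(batch, self.rnn.hidden_size)
            for t in range(self.num_glimpses):
                # Cycle between the two objects; the hidden state carries
                # observations from one into the attention over the other.
                image = img_a if t % 2 == 0 else img_b
                h, c = self.rnn(self.glimpse(image, h), (h, c))
            return torch.sigmoid(self.score(h))  # similarity estimate in (0, 1)

    if __name__ == "__main__":
        model = AttentiveRecurrentComparator()
        a, b = torch.rand(4, 1, 32, 32), torch.rand(4, 1, 32, 32)
        print(model(a, b).shape)  # torch.Size([4, 1])

Given two batches of single-channel images, the model returns one similarity score per pair; a binary cross-entropy loss on same/different pairs would train such a sketch end to end.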