Attentive Recurrent Comparators

Submitted to ICLR 2017
Abstract: Attentive Recurrent Comparators (ARCs) are a novel class of neural networks, built with attention and recurrence, that learn to estimate the similarity of a set of objects by cycling through them and making observations. The observations made in one object are conditioned on the observations made in all the other objects. This allows ARCs to learn to focus on the salient aspects needed to ascertain similarity. Our simple model, which does not use any convolutions, performs comparably to Deep Convolutional Siamese Networks on various visual tasks. However, using ARCs and convolutional feature extractors in conjunction produces a model that is significantly better than any other method and has superior generalization capabilities. On the Omniglot dataset, ARC-based models achieve an error rate of 1.5% on the one-shot classification task, a 2-3x reduction compared to the previous best models. This is also the first deep learning model to outperform humans (4.5%) and to surpass the state-of-the-art accuracy set by the highly specialized Hierarchical Bayesian Program Learning (HBPL) system (3.3%).
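The mechanism described in the abstract lends itself to a compact sketch: a recurrent controller alternates glimpses between the two inputs, so each new glimpse location is a function of everything seen so far in both objects, and the final hidden state is read out as a similarity score. Below is a minimal illustration in PyTorch; it is a sketch of the idea, not the authors' implementation. The affine-grid soft crop stands in for the paper's attention mechanism, and all names and sizes (`MiniARC`, `glimpse_size`, `hidden_size`, `num_glimpses`) are illustrative assumptions.

```python
# Minimal sketch of the ARC idea: a recurrent controller alternates attention
# glimpses between two images, conditioning each glimpse on all prior ones.
# Hypothetical names/sizes; the soft crop below is a stand-in for the paper's
# attention mechanism.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MiniARC(nn.Module):
    def __init__(self, glimpse_size=8, hidden_size=128, num_glimpses=8):
        super().__init__()
        self.glimpse_size = glimpse_size
        self.num_glimpses = num_glimpses
        self.rnn = nn.LSTMCell(glimpse_size * glimpse_size, hidden_size)
        self.attend = nn.Linear(hidden_size, 3)  # predicts (x, y, zoom)
        self.score = nn.Linear(hidden_size, 1)   # similarity head

    def glimpse(self, images, params):
        # Differentiable soft crop via affine grid sampling.
        x, y, zoom = torch.tanh(params).unbind(dim=1)
        scale = 0.5 + 0.25 * (zoom + 1)  # keep scale in (0.5, 1.0)
        theta = torch.zeros(images.size(0), 2, 3, device=images.device)
        theta[:, 0, 0] = scale
        theta[:, 1, 1] = scale
        theta[:, 0, 2] = x
        theta[:, 1, 2] = y
        size = (images.size(0), 1, self.glimpse_size, self.glimpse_size)
        grid = F.affine_grid(theta, size, align_corners=False)
        return F.grid_sample(images, grid, align_corners=False).flatten(1)

    def forward(self, img_a, img_b):
        batch = img_a.size(0)
        h = img_a.new_zeros(batch, self.rnn.hidden_size)
        c = img_a.new_zeros(batch, self.rnn.hidden_size)
        for t in range(self.num_glimpses):
            # Alternate between the two images; each glimpse location is
            # predicted from the hidden state, so what the model looks at in
            # one image depends on what it has already seen in the other.
            target = img_a if t % 2 == 0 else img_b
            g = self.glimpse(target, self.attend(h))
            h, c = self.rnn(g, (h, c))
        return torch.sigmoid(self.score(h))  # probability the pair matches

if __name__ == "__main__":
    model = MiniARC()
    a, b = torch.rand(4, 1, 32, 32), torch.rand(4, 1, 32, 32)
    print(model(a, b).shape)  # torch.Size([4, 1])
```

Trained with a binary same/different loss on character pairs, such a comparator can then be applied to one-shot classification by scoring a test image against each candidate class exemplar and picking the best match.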
TL;DR: Attention and recurrence can be as good as convolution in some cases; combining all three yields bigger returns.
Conflicts: csa.iisc.ernet.in
Keywords: Deep learning, Computer vision
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:1703.00767/code)