Meta-Learning Neural Bloom Filters

Jack W Rae, Sergey Bartunov, Timothy P Lillicrap

Sep 27, 2018 · ICLR 2019 Conference Blind Submission
  • Abstract: There has been a recent trend in training neural networks to replace data structures that have been crafted by hand, aiming for faster execution, better accuracy, or greater compression. In this setting, a neural data structure is instantiated by training a network over many epochs of its inputs until convergence. In many applications this expensive initialization is not practical, for example in streaming algorithms, where inputs are ephemeral and can only be inspected a small number of times. In this paper we explore the learning of approximate set membership over a stream of data in one shot via meta-learning. We propose a novel memory architecture, the Neural Bloom Filter, which we show to be more compressive than Bloom Filters and several existing memory-augmented neural networks in scenarios of skewed data or structured sets. (A sketch of the classical Bloom filter baseline follows after this list.)
  • Keywords: meta-learning, memory, one-shot learning, bloom filter, set membership, familiarity, compression
  • TL;DR: We investigate the space efficiency of memory-augmented neural nets when learning set membership.
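For readers unfamiliar with the baseline the abstract compares against: a classical Bloom filter answers approximate set membership with a bit array and k hash functions, allowing false positives but never false negatives. The minimal Python sketch below shows that classical structure only, not the paper's Neural Bloom Filter; the sizing parameters (num_bits, num_hashes) are illustrative assumptions, not values from the paper.

```python
import hashlib


class BloomFilter:
    """Minimal classical Bloom filter: approximate set membership
    with possible false positives but no false negatives."""

    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = [False] * num_bits

    def _indexes(self, item):
        # Derive k bit positions by salting one hash function with
        # a counter; this stands in for k independent hash functions.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.num_bits

    def add(self, item):
        # Insertion sets k bits; the item itself is never stored.
        for idx in self._indexes(item):
            self.bits[idx] = True

    def __contains__(self, item):
        # All k bits set => "probably in the set" (may be a false
        # positive); any unset bit => definitely not in the set.
        return all(self.bits[idx] for idx in self._indexes(item))


# Usage: stored items always report present; unseen items usually don't.
bf = BloomFilter()
bf.add("example-key")
assert "example-key" in bf
print("unseen key present?", "never-added" in bf)  # usually False
```

The paper's contribution is a learned, meta-trained alternative to this fixed hash-based structure, claimed to be more compressive when the data is skewed or structured.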