Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems

Colin Raffel, Daniel P. W. Ellis

Feb 12, 2016 · ICLR 2016 workshop submission
  • Abstract: We propose a simplified model of attention that is applicable to feed-forward neural networks. We demonstrate that the resulting model can solve the synthetic "addition" and "multiplication" long-term memory problems for sequence lengths that are both longer and more widely varying than the best published results for these tasks. (A minimal code sketch of this attention mechanism follows below.)
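To make the abstract's "simplified model of attention" concrete, here is a minimal sketch: each hidden state h_t is scored independently by a learnable function a(·), the scores are softmax-normalized over time, and the sequence is collapsed to a single weighted average. The tanh scoring layer and the parameter shapes below are illustrative assumptions, not necessarily the paper's exact configuration.

```python
import numpy as np

def feed_forward_attention(h, w, b=0.0):
    """Collapse a sequence of hidden states h (shape [T, D]) to one vector.

    Computes e_t = a(h_t), alpha = softmax(e), c = sum_t alpha_t * h_t.
    The scoring function a(.) is assumed here to be a single tanh unit.
    """
    e = np.tanh(h @ w + b)               # one scalar score per time step, shape [T]
    alpha = np.exp(e - e.max())          # numerically stable softmax over time
    alpha /= alpha.sum()
    c = (alpha[:, None] * h).sum(axis=0)  # attention-weighted average, shape [D]
    return c, alpha

# Illustrative usage with random hidden states (T time steps, D dimensions).
T, D = 500, 16
h = np.random.randn(T, D)
w = np.random.randn(D) / np.sqrt(D)
c, alpha = feed_forward_attention(h, w)
assert c.shape == (D,) and np.isclose(alpha.sum(), 1.0)
```

Because each attention weight depends only on its own h_t (there is no recurrent state or query), the whole model remains feed-forward, which is what lets it integrate information over the long, variable-length sequences used in the addition and multiplication tasks.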
