Modifying memories in a Recurrent Neural Network Unit

Anonymous

Nov 03, 2017 (modified: Nov 03, 2017) · ICLR 2018 Conference Blind Submission
  • Abstract: Long Short-Term Memory (LSTM) units have the ability to memorise and use long-term dependencies between inputs to generate predictions on time series data. We introduce the concept of modifying the cell state (memory) of LSTMs using rotation matrices parametrised by a new set of trainable weights. This addition yields significant performance gains on some of the tasks from the bAbI dataset. (A rough schematic of the proposed modification is sketched below.)
  • TL;DR: Adding a new set of trainable weights that rotate the LSTM's cell memory improves performance on some bAbI tasks.
  • Keywords: LSTM, RNN, rotation matrix, long-term memory, natural language processing
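
The abstract only states that the cell state is rotated by matrices parametrised by extra trainable weights, so the following is a minimal sketch under assumptions, not the authors' implementation: the class name `RotLSTMCell`, the parametrisation of the rotation as R = expm(A − Aᵀ) with a trainable matrix A, and the placement of the rotation (applied to the previous cell state before the usual gate update) are all illustrative choices.

```python
# Minimal sketch (assumptions, not the paper's code): an LSTM-style cell whose
# cell state is rotated by R = expm(A - A^T), where A is an extra trainable
# weight matrix. A - A^T is skew-symmetric, so R is a rotation matrix.
import numpy as np
from scipy.linalg import expm


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class RotLSTMCell:  # hypothetical name, for illustration only
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        h, d = hidden_size, input_size
        # Standard LSTM gate weights (input, forget, output, candidate), stacked.
        self.W = rng.normal(0.0, 0.1, (4 * h, d + h))
        self.b = np.zeros(4 * h)
        # New trainable parameters defining the rotation of the cell memory.
        self.A = rng.normal(0.0, 0.1, (h, h))

    def step(self, x, h_prev, c_prev):
        z = self.W @ np.concatenate([x, h_prev]) + self.b
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        # Rotate the previous cell state before the usual LSTM update
        # (one plausible placement; the paper may apply the rotation elsewhere).
        R = expm(self.A - self.A.T)
        c_rot = R @ c_prev
        c = f * c_rot + i * g
        h = o * np.tanh(c)
        return h, c


# Tiny usage example on random inputs.
cell = RotLSTMCell(input_size=3, hidden_size=4)
h, c = np.zeros(4), np.zeros(4)
for x in np.random.default_rng(1).normal(size=(5, 3)):
    h, c = cell.step(x, h, c)
print(h.round(3))
```

Parametrising the rotation through the matrix exponential of a skew-symmetric matrix is one standard way to keep R orthogonal with determinant 1 for any value of the trainable weights; the paper may use a different parametrisation.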
