Modifying memories in a Recurrent Neural Network Unit

15 Feb 2018 (modified: 10 Feb 2022) · ICLR 2018 Conference Blind Submission
Abstract: Long Short-Term Memory (LSTM) units have the ability to memorise and use long-term dependencies between inputs to generate predictions on time-series data. We introduce the concept of modifying the cell state (memory) of LSTMs using rotation matrices parametrised by a new set of trainable weights. This addition yields significant performance gains on some of the tasks from the bAbI dataset.
TL;DR: Adding a new set of weights to the LSTM that rotate the cell memory improves performance on some bAbI tasks.
Keywords: LSTM, RNN, rotation matrix, long-term memory, natural language processing
Data: [bAbI](https://paperswithcode.com/dataset/babi-1)
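
To illustrate the idea described in the abstract, below is a minimal sketch of an LSTM cell whose memory is transformed by a trainable rotation. The class name `RotLSTMCell` and the specific parametrisation (pairwise 2x2 Givens rotations of the cell state, with one trainable angle per pair) are assumptions for illustration only; the submission's exact construction of the rotation matrix may differ.

```python
import torch
import torch.nn as nn

class RotLSTMCell(nn.Module):
    """LSTM cell whose cell state is rotated by a trainable rotation.

    Hypothetical sketch: the rotation is built from trainable angles applied
    as 2x2 Givens rotations to consecutive pairs of cell-state dimensions.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        assert hidden_size % 2 == 0, "pairwise rotation needs an even hidden size"
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # One trainable rotation angle per pair of cell-state dimensions.
        self.theta = nn.Parameter(torch.zeros(hidden_size // 2))

    def forward(self, x, state):
        h, c = self.cell(x, state)
        # Rotate each (c_{2i}, c_{2i+1}) pair of the cell state by angle theta_i.
        c_pairs = c.view(c.size(0), -1, 2)
        cos, sin = torch.cos(self.theta), torch.sin(self.theta)
        c0 = cos * c_pairs[..., 0] - sin * c_pairs[..., 1]
        c1 = sin * c_pairs[..., 0] + cos * c_pairs[..., 1]
        c_rot = torch.stack((c0, c1), dim=-1).view_as(c)
        return h, c_rot
```

Usage would mirror a standard `nn.LSTMCell`: step through a sequence, feeding back `(h, c_rot)` as the next state, so the learned rotation is applied to the memory at every time step.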
