Modifying memories in a Recurrent Neural Network Unit
Vlad Velici, Adam Prügel-Bennett
Feb 15, 2018 (modified: Feb 15, 2018) · ICLR 2018 Conference Blind Submission
Abstract: Long Short-Term Memory (LSTM) units have the ability to memorise and use long-term dependencies between inputs to generate predictions on time series data. We introduce the concept of modifying the cell state (memory) of LSTMs using rotation matrices parametrised by a new set of trainable weights. This addition yields significant performance gains on some of the tasks from the bAbI dataset.
TL;DR: Adding a new set of weights to the LSTM that rotate the cell memory improves performance on some bAbI tasks.
Keywords: LSTM, RNN, rotation matrix, long-term memory, natural language processing
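The abstract describes rotating the LSTM cell state with rotation matrices parametrised by new trainable weights. The exact parametrisation is not given in the abstract, so the following is only a minimal illustrative sketch, assuming the rotation is built from independent Givens (pairwise) rotations on adjacent cell dimensions, with angles produced by a trainable linear map of the input; the class name `RotLSTMCell` and this parametrisation are hypothetical, not the authors' stated method.

```python
import torch
import torch.nn as nn


class RotLSTMCell(nn.Module):
    """Hypothetical sketch of an LSTM cell whose memory is rotated each step.

    Assumption: the rotation matrix is parametrised as a product of
    independent Givens rotations on adjacent pairs of cell dimensions,
    with one learned angle per pair. The paper only states that the
    rotations are parametrised by a new set of trainable weights.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        assert hidden_size % 2 == 0, "pairwise rotations need an even hidden size"
        self.lstm = nn.LSTMCell(input_size, hidden_size)
        # The new trainable weights: one rotation angle per pair of cell dims,
        # computed from the current input.
        self.angles = nn.Linear(input_size, hidden_size // 2)

    def forward(self, x, state):
        h, c = self.lstm(x, state)
        # Rotate each adjacent pair (c[2i], c[2i+1]) by its learned angle.
        theta = self.angles(x)                      # (batch, hidden/2)
        cos, sin = torch.cos(theta), torch.sin(theta)
        pairs = c.view(c.size(0), -1, 2)            # (batch, hidden/2, 2)
        c0, c1 = pairs[..., 0], pairs[..., 1]
        c_rot = torch.stack(
            (cos * c0 - sin * c1, sin * c0 + cos * c1), dim=-1
        ).view(c.size(0), -1)                       # back to (batch, hidden)
        return h, c_rot
```

A pairwise parametrisation like this keeps the added cost linear in the cell size while still guaranteeing the transform is orthogonal (norm-preserving), which is one plausible reason rotating the memory, rather than scaling it, could help retain long-term information.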