- Keywords: Recurrent Neural Networks, Neural Circuits, Activation Manifolds, Deep Learning
- TL;DR: We investigate how a recurrent neural network successfully learns a task combining long-term memory and sequential recall.
- Abstract: We investigate the learned dynamical landscape of a recurrent neural network solving a simple task that requires the interaction of two memory mechanisms: long- and short-term. Our results show that while long-term memory is implemented by asymptotic attractors, sequential recall is additionally implemented by oscillatory dynamics in a subspace transverse to the basins of attraction of these stable steady states. Based on these observations, we propose how different types of memory mechanisms can coexist and cooperate within a single neural network, and discuss possible applications to artificial intelligence and neuroscience.
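The coexistence of the two mechanisms described in the abstract can be sketched with a toy linear recurrence (this is an illustration of the general idea, not the paper's trained network): one state coordinate contracts onto a stable fixed point, standing in for attractor-based long-term memory, while two other coordinates rotate, standing in for oscillatory dynamics in a transverse subspace. The angle `theta` and contraction rate `rho` are assumed values chosen for illustration.

```python
import math

# Toy 3-D linear recurrence x_{t+1} = A x_t (illustrative sketch only):
# the first coordinate decays toward a stable fixed point (long-term
# memory as an asymptotic attractor), while the other two rotate,
# giving oscillatory dynamics in a subspace transverse to the
# attractor direction (a stand-in for sequential recall).
theta = 0.3   # rotation angle per step (assumed for illustration)
rho = 0.9     # contraction rate toward the attractor (assumed)

def step(x):
    m, u, v = x
    return (
        rho * m,                                    # decay onto the fixed point
        math.cos(theta) * u - math.sin(theta) * v,  # rotation in the
        math.sin(theta) * u + math.cos(theta) * v,  # transverse plane
    )

x = (1.0, 1.0, 0.0)
for _ in range(200):
    x = step(x)

print(abs(x[0]))               # ~0: the memory coordinate has settled into the attractor
print(math.hypot(x[1], x[2]))  # ~1: the oscillation amplitude is preserved
```

The point of the sketch is that the two behaviors occupy orthogonal subspaces of the same state, so convergence to the attractor does not damp the oscillation.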