Multiple Sequential Learning Tasks Represented in Recurrent Neural Networks

24 Sept 2021, 16:11 (edited 30 Nov 2021) · NeurIPS-AI4Science Poster
  • Keywords: Sequential learning, Multitask, Recurrent neural networks
  • TL;DR: We trained a single RNN to perform multiple sequential learning tasks at the same time, hoping to provide a computational platform to investigate the neural representations of cognitive sequential learning abilities.
  • Abstract: Our brain can flexibly perform a variety of sequential learning tasks, including music, language, and mathematics, but the underlying mechanism has not been elucidated by traditional experimental and modeling studies, which were designed for only one task at a time. From a computational perspective, we hypothesize that the working mechanism of a multitask model can suggest a possible mechanism in the brain. We therefore trained a single recurrent neural network to perform 8 sequential learning tasks that depend on working memory, structure extraction, categorization, and other cognitive processes. After training, the model learns sophisticated information-holding and erasing strategies to perform multiple tasks simultaneously. More interestingly, the model learns to reuse neurons to encode similar task features. We hope this work can provide a computational platform for investigating the neural representations underlying cognitive sequential learning abilities.
  • Track: Original Research Track
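The setup described in the abstract — one shared recurrent network performing several tasks, with overlapping neurons encoding similar task features — can be sketched as a single recurrent core that receives the stimulus plus a one-hot task cue and is read out linearly. This is a minimal illustration under assumed sizes (number of inputs, hidden units, outputs), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

N_TASKS = 8   # the abstract trains 8 sequential learning tasks
N_IN = 4      # stimulus dimensionality (assumed)
N_HID = 64    # recurrent units (assumed)
N_OUT = 3     # output dimensionality (assumed)

# Shared weights: every task drives the same hidden population, so the
# trained network can reuse neurons to encode features shared across tasks.
W_in = rng.normal(0, 0.1, (N_HID, N_IN + N_TASKS))   # stimulus + task cue
W_rec = rng.normal(0, 1 / np.sqrt(N_HID), (N_HID, N_HID))
W_out = rng.normal(0, 0.1, (N_OUT, N_HID))

def run_trial(stimuli, task_id):
    """Run one trial: stimuli is (T, N_IN); a one-hot task cue is
    appended to every input frame, as in cue-based multitask RNN models."""
    cue = np.eye(N_TASKS)[task_id]
    h = np.zeros(N_HID)
    outputs = []
    for x in stimuli:
        u = np.concatenate([x, cue])
        h = np.tanh(W_in @ u + W_rec @ h)   # shared recurrent dynamics
        outputs.append(W_out @ h)           # per-timestep linear readout
    return np.array(outputs)

out = run_trial(rng.normal(size=(10, N_IN)), task_id=2)
print(out.shape)  # (10, 3)
```

In this sketch, switching the task cue changes the trajectory of the same hidden population rather than switching to a separate network, which is what makes shared, reusable representations possible after training.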