Neuromodulation Gated Transformer

01 Mar 2023 (modified: 12 Mar 2024) · Submitted to Tiny Papers @ ICLR 2023
Keywords: transformers, neuromodulation, question answering, natural language processing, natural language understanding
TL;DR: This paper introduces a novel approach integrating neuromodulation into transformers. We evaluate performance on the SuperGLUE benchmark and find that it achieves the best average performance, although results vary across datasets.
Abstract: We introduce a novel architecture, the Neuromodulation Gated Transformer (NGT), which is a simple implementation of neuromodulation in transformers via a multiplicative effect. We compare it to baselines and show that it results in the best average performance on the SuperGLUE benchmark validation sets.
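The abstract describes the gating as a multiplicative effect on the network's activations. The paper's exact layer is not given here, so the following is only a minimal illustrative sketch of one plausible form: a learned sigmoid gate applied elementwise to transformer hidden states (the function name, weight shapes, and gate parameterization are all assumptions, not the published architecture).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neuromodulation_gate(hidden, W_g, b_g):
    """Hypothetical multiplicative neuromodulation gate.

    Each hidden state is scaled elementwise by a learned gate in (0, 1),
    mimicking a neuromodulatory signal that amplifies or suppresses
    activations multiplicatively.
    """
    gate = sigmoid(hidden @ W_g + b_g)  # (seq_len, d), values in (0, 1)
    return hidden * gate                # multiplicative effect

# Usage: gate a small batch of hidden states.
rng = np.random.default_rng(0)
d = 8
hidden = rng.standard_normal((4, d))        # (seq_len, hidden_dim)
W_g = rng.standard_normal((d, d)) * 0.1     # gate weights (toy init)
b_g = np.zeros(d)                           # gate bias
out = neuromodulation_gate(hidden, W_g, b_g)
```

Because the gate lies strictly in (0, 1), the output never exceeds the input activation in magnitude; the mechanism can only attenuate, not amplify, under this particular parameterization.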
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/arxiv:2305.03232/code)