N-grammer: Augmenting Transformers with latent n-grams

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Transformer models have recently emerged as one of the foundational models in natural language processing, and as a byproduct, there has been significant recent interest and investment in scaling these models. However, the training and inference costs of these large Transformer language models are prohibitive, necessitating further research into more efficient variants. In this work, we propose a simple yet effective modification to the Transformer architecture, inspired by the statistical language modeling literature: we augment the model with n-grams constructed from a discrete latent representation of the text sequence. We evaluate our model, the N-grammer, on language modeling on the C4 dataset, and find that it outperforms several strong baselines such as the Transformer and the Primer. We will open-source our model for reproducibility purposes upon acceptance.
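To make the idea in the abstract concrete, below is a minimal sketch of what "augmenting with n-grams from a discrete latent representation" could look like. The abstract does not specify the details, so everything here is an assumption: the codebook quantization, the bigram hashing into a fixed table, the concatenation with token embeddings, and all sizes are illustrative, not the authors' implementation.

    import numpy as np

    # Toy sizes; the abstract does not specify these (assumptions).
    vocab_size = 32      # token vocabulary
    d_model = 16         # token embedding width
    num_codes = 8        # discrete latent codebook size (assumption)
    ngram_vocab = 997    # hashed bigram table size (assumption)
    d_ngram = 8          # bigram embedding width (assumption)

    rng = np.random.default_rng(0)
    token_emb = rng.normal(size=(vocab_size, d_model))
    codebook = rng.normal(size=(num_codes, d_model))
    bigram_emb = rng.normal(size=(ngram_vocab, d_ngram))

    def latent_ngram_augment(token_ids):
        """Augment token embeddings with embeddings of latent bigrams.

        1. Embed tokens and quantize each embedding to its nearest codebook
           entry, giving a discrete latent ID per position.
        2. Pair consecutive latent IDs into bigram IDs, hashed into a fixed table.
        3. Look up bigram embeddings and concatenate them with the token embeddings.
        """
        x = token_emb[token_ids]                                   # (seq, d_model)
        # Nearest-neighbour quantization against the codebook.
        dists = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        latent_ids = dists.argmin(axis=1)                          # (seq,)
        # Bigram of each position with its predecessor (first position pairs with itself).
        prev = np.concatenate([latent_ids[:1], latent_ids[:-1]])
        bigram_ids = (prev * num_codes + latent_ids) % ngram_vocab
        ngram_feats = bigram_emb[bigram_ids]                       # (seq, d_ngram)
        return np.concatenate([x, ngram_feats], axis=-1)           # (seq, d_model + d_ngram)

    augmented = latent_ngram_augment(np.array([3, 14, 15, 9, 2]))
    print(augmented.shape)  # (5, 24)

The augmented representations would then feed into the Transformer stack in place of the plain token embeddings; how the combination is done in the actual model (concatenation, addition, or otherwise) is not stated in the abstract.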