A Comparison of Transformer-Based Language Models on NLP Benchmarks

NLDB 2022 (modified: 21 Jan 2023)
Abstract: Since the advent of BERT, Transformer-based language models (TLMs) have shown outstanding effectiveness in several NLP tasks. In this paper, we aim to bring order to the landscape of TLMs and their performance on important NLP benchmarks. Our analysis sheds light on the advantages that some TLMs have over others, but also reveals issues that prevent a complete and fair comparison in some situations.