mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer

NAACL-HLT 2021
Authors: Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel.
Published in: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021.