BERT models for Brazilian Portuguese: Pretraining, evaluation and tokenization analysis

Published: 01 Jan 2024 · Last Modified: 19 Feb 2025 · Appl. Soft Comput. 2024 · License: CC BY-SA 4.0
Abstract: Highlights
- Release of pretrained BERT models for Brazilian Portuguese trained on the brWaC corpus.
- State-of-the-art performance on three Portuguese NLP tasks: semantic textual similarity (STS), recognizing textual entailment (RTE), and named entity recognition (NER).
- Tokenization analysis reveals a correlation between task performance and subword splits.
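The tokenization analysis mentioned above relates task performance to how often words get split into subwords. One common way to quantify this is subword "fertility": the average number of pieces per word, where 1.0 means no splitting. Below is a minimal sketch of that metric using a toy greedy longest-match WordPiece-style splitter; the vocabulary and example words are illustrative assumptions, not taken from the paper.

```python
def wordpiece_split(word, vocab):
    """Greedy longest-match-first WordPiece-style split of a single word.
    Returns the list of subword pieces, or ["[UNK]"] if no split exists."""
    pieces = []
    start = 0
    while start < len(word):
        end = len(word)
        piece = None
        while end > start:
            candidate = word[start:end]
            if start > 0:  # continuation pieces carry the "##" prefix
                candidate = "##" + candidate
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]
        pieces.append(piece)
        start = end
    return pieces


def fertility(words, vocab):
    """Average number of subword pieces per word (1.0 = no splitting)."""
    splits = [wordpiece_split(w, vocab) for w in words]
    return sum(len(s) for s in splits) / len(words)


# Hypothetical toy vocabulary: "tokenização" gets split into three pieces,
# the other words are kept whole.
vocab = {"modelo", "de", "linguagem", "token", "##iza", "##ção"}
words = ["modelo", "de", "linguagem", "tokenização"]
print(fertility(words, vocab))  # (1 + 1 + 1 + 3) / 4 = 1.5
```

A vocabulary better adapted to the language (like the paper's Portuguese-specific one versus a multilingual one) yields lower fertility, i.e. fewer splits per word.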