Effect of training fragment length on Transformers in text complexity prediction

01 Mar 2023 (modified: 30 May 2023) · Submitted to Tiny Papers @ ICLR 2023
Keywords: Text complexity classification, machine learning, Transformers, BERT
TL;DR: We studied the effects of using different text fragment lengths on the performance and training time of models in the text complexity classification task
Abstract: Given the myriad practical applications of text complexity classification, it is important to optimize the training text fragment size for performance. We experiment with fine-tuning pre-trained BERT models to classify the complexity of Russian school texts using different fragment sizes for training.
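The fragmentation step implied by the abstract, splitting each training text into fixed-length pieces before fine-tuning, can be sketched as follows. This is an illustrative simplification: whitespace tokenization stands in for BERT's subword tokenizer, and the function name and fragment sizes are assumptions, not taken from the paper.

```python
def split_into_fragments(text: str, fragment_len: int) -> list[str]:
    """Split a text into fragments of at most `fragment_len` tokens.

    Whitespace tokenization is a simplification; the paper fine-tunes
    BERT, whose subword tokenizer would be used in practice.
    """
    tokens = text.split()
    return [
        " ".join(tokens[i:i + fragment_len])
        for i in range(0, len(tokens), fragment_len)
    ]


# Comparing two candidate fragment sizes on the same text yields
# different numbers of (shorter or longer) training examples.
sample = "one two three four five six seven"
print(split_into_fragments(sample, 3))  # → ['one two three', 'four five six', 'seven']
print(split_into_fragments(sample, 5))  # → ['one two three four five', 'six seven']
```

Varying `fragment_len` trades off context per example against the number of training examples, which is the axis the paper studies for both classification performance and training time.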