Beyond Flesch-Kincaid: New Neural Metrics to Improve Difficulty Prediction for Educational Texts

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission
TL;DR: We demonstrate the promise of neural metrics as a new class of features for evaluating text adaptation to different education levels.
Abstract: Using large language models (LLMs) for educational applications such as dialogue-based teaching is an area of growing interest. Effective teaching, however, requires adapting the difficulty of content and explanations to the education level of the students. Even the best LLMs today struggle to do this well. If we want to improve LLMs on this adaptation task, we need to be able to reliably measure adaptation success. However, current static metrics for text difficulty, like the Flesch-Kincaid Reading Ease score, are known to be crude and brittle. We therefore introduce and evaluate a new set of neural metrics for text difficulty. Informed by a user study, we design the neural metrics as LLM prompts that leverage the general language understanding capabilities of LLMs to capture more abstract and complex text features than static metrics can. Through regression experiments, we show that our neural metrics improve text difficulty prediction over static metrics alone. Our results demonstrate the promise of neural metrics as a new class of features for evaluating text adaptation to different education levels.
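
To make the abstract's setup concrete, below is a minimal sketch (not the authors' code) of how a static readability feature and a prompt-based neural metric might be combined in a difficulty regression. The prompt wording, the placeholder neural_metric function, and the toy data are illustrative assumptions; only the Flesch Reading Ease formula and the scikit-learn regression API are standard.

# Sketch: combining a static readability feature with a hypothetical
# prompt-based "neural metric" in a regression, as outlined in the abstract.
import re
from sklearn.linear_model import Ridge

def flesch_reading_ease(text: str) -> float:
    """Static metric: classic Flesch Reading Ease formula."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    # Rough syllable count: runs of vowels per word.
    n_syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syllables / n_words)

# Illustrative prompt template for a neural metric (assumed, not from the paper).
NEURAL_METRIC_PROMPT = (
    "On a scale from 1 (very concrete) to 5 (very abstract), rate how abstract "
    "the concepts in the following text are. Answer with a single number.\n\n{text}"
)

def neural_metric(text: str) -> float:
    """Neural metric: score obtained by prompting an LLM.

    In a real setup this would send NEURAL_METRIC_PROMPT.format(text=text)
    to an LLM and parse the returned number; here it returns a dummy value
    so the sketch runs without API access.
    """
    return 3.0

def features(text: str) -> list[float]:
    return [flesch_reading_ease(text), neural_metric(text)]

# Toy regression: predict a difficulty label (e.g. grade level) from
# static + neural features. Real experiments would use a labeled corpus.
texts = ["The cat sat on the mat.",
         "Quantum entanglement challenges classical intuitions about locality."]
grade_levels = [1.0, 12.0]
model = Ridge().fit([features(t) for t in texts], grade_levels)
print(model.predict([features("Photosynthesis converts light into chemical energy.")]))

The design point the sketch illustrates is that neural metrics enter the prediction pipeline as ordinary feature columns alongside static scores, so any standard regressor can test whether they add signal beyond the static baseline.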
Paper Type: long
Research Area: Resources and Evaluation
Contribution Types: Data resources
Languages Studied: English