Towards Minimal Supervision BERT-based Grammar Error Correction

CoRR 2020 (modified: 14 Sept 2021)
Abstract: Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-limited settings. We incorporate contextual information from a pre-trained language model to make better use of limited annotations and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
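The abstract does not specify the method's details, but the core idea it names, using a pre-trained masked language model's contextual predictions instead of large annotated corpora, can be illustrated with a minimal sketch. The snippet below is a hypothetical illustration (not the paper's actual method) that uses HuggingFace Transformers: it masks each token of a sentence in turn and flags tokens that BERT considers contextually unlikely, proposing its top predictions as candidate corrections.

```python
import torch
from transformers import BertTokenizerFast, BertForMaskedLM

# Hypothetical sketch: uses BERT's masked-LM head to flag contextually
# unlikely tokens and suggest replacements, with no GEC-specific training.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def suggest_corrections(sentence: str, top_k: int = 3):
    """Mask each token in turn; report tokens absent from BERT's top-k fills."""
    input_ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    suggestions = []
    # Skip positions 0 and -1, which hold the [CLS] and [SEP] special tokens.
    for i in range(1, input_ids.size(0) - 1):
        masked = input_ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        candidates = [
            tokenizer.decode([t]).strip()
            for t in logits.topk(top_k).indices.tolist()
        ]
        original = tokenizer.decode([int(input_ids[i])]).strip()
        # If the original token is not among BERT's top contextual choices,
        # treat it as a possible error and record the alternatives.
        if original not in candidates:
            suggestions.append((i, original, candidates))
    return suggestions

print(suggest_corrections("He go to school every day."))
```

For the example sentence, such a scorer would typically flag "go" and rank "goes" or "went" among its contextual replacements. This illustrates why a pre-trained model can reduce the need for annotated GEC data: the error signal comes from the language model's distribution rather than from labeled corrections.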