Combining Paraphrase Pre-trained Model and Controllable Rules for Unsupervised Sentence Simplification

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Although neural sequence-to-sequence models for sentence simplification have made some progress, they still suffer from data sparsity and lack controllability. This paper proposes a two-stage approach to text simplification. First, since text simplification is closely related to text summarization and paraphrasing, we fine-tune a pre-trained model on summarization and paraphrase datasets. Second, to achieve interpretability and controllability, we design controllable scorers that evaluate each simplified sentence along three dimensions: adequacy, fluency, and simplicity. These scores are used to rank the generated candidates and output the best one. Experiments show that our approach improves on the previous best unsupervised model by a considerable margin of 5.53 points, achieving a new state-of-the-art result. Our method even performs competitively with supervised models in both automatic metrics and human evaluation.
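The second stage described above, scoring candidates on adequacy, fluency, and simplicity and returning the best one, can be sketched as follows. This is a minimal illustrative sketch only: the toy scoring functions and the multiplicative combination below are assumptions for demonstration, not the paper's actual scorers.

```python
# Hypothetical sketch of the candidate-reranking stage: each generated
# simplification is scored on adequacy, fluency, and simplicity, and the
# best-scoring candidate is returned. All scoring functions here are
# illustrative stand-ins, not the paper's implementation.

def adequacy(source: str, candidate: str) -> float:
    """Toy adequacy proxy: fraction of the source's unique words retained."""
    src, cand = set(source.lower().split()), set(candidate.lower().split())
    return len(src & cand) / max(len(src), 1)

def fluency(candidate: str) -> float:
    """Toy fluency proxy: penalize very short outputs (stand-in for an LM score)."""
    return min(len(candidate.split()) / 5.0, 1.0)

def simplicity(source: str, candidate: str) -> float:
    """Toy simplicity proxy: reward outputs shorter than the source."""
    return 1.0 - len(candidate.split()) / max(len(source.split()), 1)

def rerank(source: str, candidates: list[str]) -> str:
    """Combine the three scores multiplicatively and return the best candidate."""
    def score(c: str) -> float:
        return adequacy(source, c) * fluency(c) * max(simplicity(source, c), 0.0)
    return max(candidates, key=score)
```

In this sketch, a candidate identical to the source scores zero on simplicity and is never chosen, while an overly aggressive deletion is penalized by the adequacy and fluency terms, so the ranking balances all three aspects.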