Abstract: Prior work on supervised summarization is mainly based on end-to-end models, which suffer from low modularity, unfaithfulness, and poor interpretability. To address this, we propose a new three-phase modular abstractive sentence summarization method. We explicitly split the summarization problem into three stages: knowledge extraction, content selection, and rewriting. We use multiple knowledge extractors to obtain relation triples from the text, fine-tune a classifier to select the content to be included in the summary, and use a fine-tuned BART rewriter to turn the selected triples into a natural-language summary. Our model shows good modularity, as its modules can be trained separately and on different datasets. Automatic and human evaluations demonstrate that our method is competitive with state-of-the-art methods and more faithful than end-to-end baseline models.
Paper Type: short
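The abstract's three-stage pipeline (triple extraction, content selection, rewriting) can be illustrated with a minimal sketch. This is not the authors' code: the model names are placeholders for the paper's fine-tuned checkpoints, the triple extractor is left as a stub for whichever OpenIE-style systems are plugged in, and the triple linearization format (the "|" and "[SEP]" separators) is an assumption for illustration only.

```python
"""Hypothetical sketch of the three-stage pipeline described in the abstract:
(1) extract relation triples, (2) select triples with a fine-tuned classifier,
(3) rewrite the selected triples into a summary with a fine-tuned BART model."""
import torch
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    BartForConditionalGeneration,
)


def extract_triples(text: str) -> list[tuple[str, str, str]]:
    """Stage 1 placeholder: in the paper, multiple knowledge extractors
    return (subject, relation, object) triples for the input sentence."""
    raise NotImplementedError("plug in your triple extractor(s) here")


def select_triples(triples, selector_name="your-finetuned-selector"):
    """Stage 2: score each triple with a fine-tuned binary classifier and
    keep the ones predicted to belong in the summary."""
    tok = AutoTokenizer.from_pretrained(selector_name)
    clf = AutoModelForSequenceClassification.from_pretrained(selector_name)
    kept = []
    for subj, rel, obj in triples:
        inputs = tok(f"{subj} {rel} {obj}", return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = clf(**inputs).logits
        if logits.argmax(dim=-1).item() == 1:  # label 1 = "include in summary"
            kept.append((subj, rel, obj))
    return kept


def rewrite(triples, rewriter_name="your-finetuned-bart-rewriter"):
    """Stage 3: linearize the selected triples and let a fine-tuned BART
    rewrite them into a fluent natural-language summary."""
    tok = AutoTokenizer.from_pretrained(rewriter_name)
    bart = BartForConditionalGeneration.from_pretrained(rewriter_name)
    linearized = " [SEP] ".join(f"{s} | {r} | {o}" for s, r, o in triples)
    inputs = tok(linearized, return_tensors="pt", truncation=True)
    ids = bart.generate(**inputs, max_length=60, num_beams=4)
    return tok.decode(ids[0], skip_special_tokens=True)
```

Because each stage is a separate function backed by its own model, the modules can be trained or swapped independently, which is the modularity property the abstract emphasizes.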