FactoScalpel: Enhancing the Factual Consistency of Abstractive Summarization through Knowledge Injection

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: Abstractive summarization still suffers from factual inconsistencies in generated summaries. Inspired by prior work on knowledge storage in Transformers, we first explore the relationship between factual errors and the Feed-Forward Networks (FFNs) of the Transformer, and propose a factual-error attribution method. Guided by these results, we inject knowledge into the decoder for the first time and propose FactoScalpel, a fact-aware summarization model that integrates a Knowledge Bank and a router-controlled mechanism into the FFNs. By introducing facts through the Knowledge Bank and balancing the original FFN against the newly added Knowledge Bank module via the router-controlled mechanism, FactoScalpel improves the factual consistency of the decoder through fine-grained surgery. We compare FactoScalpel with multiple fact-aware summarization models using several factual consistency metrics on XSum; our method achieves state-of-the-art results in most experiments.
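The abstract does not give implementation details, but the described router-controlled mechanism can be sketched as a gated combination of the original FFN output and a Knowledge Bank output. The sketch below is a minimal, hypothetical illustration in pure Python: the function names (`router_gated_ffn`), the scalar sigmoid gate, and the toy `ffn`/`kb` callables are all assumptions, not the paper's actual architecture.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def router_gated_ffn(x, ffn, kb, router_w):
    """Hypothetical router: blend the original FFN output with a
    Knowledge Bank output, weighted by g = sigmoid(router_w . x)."""
    g = sigmoid(sum(w * xi for w, xi in zip(router_w, x)))
    f = ffn(x)  # original feed-forward path
    k = kb(x)   # knowledge-bank path (stand-in for fact injection)
    return [g * fi + (1.0 - g) * ki for fi, ki in zip(f, k)]

# Toy stand-ins for the two sub-modules (purely illustrative):
ffn = lambda x: [2.0 * xi for xi in x]
kb = lambda x: [xi + 1.0 for xi in x]

# With zero router weights, g = sigmoid(0) = 0.5, an even blend.
out = router_gated_ffn([1.0, -1.0], ffn, kb, [0.0, 0.0])
# out == [2.0, -1.0]
```

In a real Transformer decoder, the router weights would be learned so the model can fall back on the original FFN for fluent generation while routing to the Knowledge Bank when factual grounding is needed.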
Paper Type: long
Research Area: Summarization
Languages Studied: English

