MoFE: Mixture of Factual Experts for Controlling Hallucinations in Abstractive Summarization

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Neural abstractive summarization models are susceptible to generating factually inconsistent content, a phenomenon known as hallucination. This limits the usability and adoption of such systems in real-world applications. To reduce hallucination, we propose the Mixture of Factual Experts (MoFE) model, which combines multiple summarization experts, each targeting a specific type of factual error. We construct MoFE by combining the experts through weight and logit ensembling strategies, and find that MoFE provides a modular way to control different factual errors while maintaining performance on standard ROUGE metrics.
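The abstract names two combination strategies: weight ensembling (averaging expert parameters) and logit ensembling (mixing expert outputs at decoding time). As a rough sketch of what these could look like (the function names, the uniform treatment of all parameters, and the exact mixing rule are assumptions for illustration, not the paper's specification):

```python
import torch

def mix_weights(expert_state_dicts, mix):
    """Weight ensembling (sketch): average expert parameters with
    mixture coefficients, producing a single merged model.

    expert_state_dicts: list of state_dicts, one per factual expert.
    mix: list of floats summing to 1 (hypothetical mixture weights).
    """
    merged = {}
    for name in expert_state_dicts[0]:
        merged[name] = sum(w * sd[name] for w, sd in zip(mix, expert_state_dicts))
    return merged

def mix_logits(expert_logits, mix):
    """Logit ensembling (sketch): weighted sum of per-expert
    next-token logits at each decoding step, before the softmax.

    expert_logits: list of [batch, vocab] tensors, one per expert.
    """
    return sum(w * logits for w, logits in zip(mix, expert_logits))

# Hypothetical usage with two experts mixed equally:
# logits = mix_logits([expert_a(input_ids), expert_b(input_ids)], [0.5, 0.5])
```

Weight ensembling yields one model with no extra inference cost, while logit ensembling keeps every expert resident at decoding time but lets the mixture weights be adjusted without retraining; the paper's modular control over individual error types presumably comes from tuning these per-expert weights.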