Simplified Rewriting Improves Expert Summarization

ACL ARR 2025 July Submission1075 Authors

29 Jul 2025 (modified: 20 Aug 2025)
License: CC BY 4.0
Abstract: Radiology report summarization (RRS) is critical for clinical workflows, requiring concise "Impressions" distilled from detailed "Findings." This paper proposes a novel prompting strategy that enhances RRS by introducing a layperson summary as an intermediate step. This summary helps normalize key observations and simplify complex terminology, using communication techniques inspired by doctor–patient interactions. Combined with few-shot in-context learning, this approach improves the model's ability to map generalized descriptions to specific clinical findings. We evaluate our method on three benchmark datasets (MIMIC-CXR, CheXpert, and MIMIC-III) and compare it against state-of-the-art open-source language models in the 7B/8B parameter range, such as Llama-3.1-8B-Instruct. Results show consistent improvements in summarization quality, with gains of up to 5% on some metrics from prompting alone, and more than 20% for some models with instruction tuning.
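The two-stage idea in the abstract (elicit a layperson summary of the Findings first, then condition the Impression on it alongside few-shot examples) can be sketched as prompt construction. This is a minimal illustration only: the function names, prompt wording, and the `(findings, impression)` example format are assumptions, not the authors' actual prompts.

```python
# Hedged sketch of an intermediate-layperson-summary prompting pipeline.
# All prompt text here is illustrative, not taken from the paper.

def layperson_prompt(findings: str) -> str:
    """Stage 1: ask the model to restate the Findings in plain language,
    as a doctor might explain them to a patient."""
    return (
        "Explain the following radiology Findings in plain language, "
        "as a doctor would to a patient:\n\n"
        f"Findings: {findings}\n\n"
        "Layperson summary:"
    )

def impression_prompt(findings: str, layperson_summary: str, examples=()):
    """Stage 2: generate the Impression, conditioning on both the original
    Findings and the intermediate layperson summary. `examples` is an
    optional sequence of (findings, impression) pairs used as few-shot
    in-context demonstrations."""
    shots = "".join(
        f"Findings: {f}\nImpression: {i}\n\n" for f, i in examples
    )
    return (
        shots
        + f"Findings: {findings}\n"
        + f"Layperson summary: {layperson_summary}\n"
        + "Impression:"
    )
```

In use, the stage-1 prompt would be sent to the model, its output captured as `layperson_summary`, and that string fed into the stage-2 prompt; any instruction-tuned 7B/8B model (e.g. Llama-3.1-8B-Instruct) could serve as the generator.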
Paper Type: Long
Research Area: Summarization
Research Area Keywords: Summarization, Radiology Reports, Biomedical NLP, In-Context Learning, Prompt Engineering, Clinical Text
Contribution Types: Model analysis & interpretability, NLP engineering experiment, Data analysis
Languages Studied: English
Submission Number: 1075