MedicalSum: A Guided Clinical Abstractive Summarization Model for Generating Medical Reports from Patient-Doctor Conversations

Anonymous

08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Blind Submission · Readers: Everyone
Paper Link: https://openreview.net/forum?id=qyxYfsostya
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: We introduce MedicalSum, a Transformer-based sequence-to-sequence architecture for summarizing medical conversations by integrating medical domain knowledge from the Unified Medical Language System (UMLS). The novel knowledge augmentation is performed in three ways: (i) introducing a guidance signal that consists of the medical words in the input sequence, (ii) leveraging semantic type knowledge in UMLS to create clinically meaningful input embeddings, and (iii) making use of a novel weighted loss function that provides a stronger incentive for the model to correctly predict words with a medical meaning. By applying these three strategies, MedicalSum takes clinical domain knowledge into consideration during the summarization process and achieves state-of-the-art ROUGE score improvements of 0.8-2 points (including a 6.2% error reduction in PE section ROUGE-1) when producing medical summaries of patient-doctor conversations. Furthermore, a qualitative analysis shows that medical summaries produced by the knowledge-augmented model contain more relevant clinical facts from the patient-doctor conversation.
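The abstract does not specify the form of the weighted loss in strategy (iii). A minimal sketch of one plausible formulation, a token-level negative log-likelihood in which tokens flagged as medical (e.g., matched to UMLS concepts) receive a higher weight, is shown below; the function name, the weight value, and the per-token medical flags are all assumptions for illustration, not the paper's actual implementation.

```python
import math

def medically_weighted_nll(token_probs, medical_flags, medical_weight=2.0):
    """Weighted negative log-likelihood over a gold token sequence.

    token_probs:   probability the model assigns to each gold token.
    medical_flags: True where the token carries medical meaning
                   (hypothetical UMLS-based tagging).
    medical_weight: up-weighting factor for medical tokens (assumed value).
    """
    total, norm = 0.0, 0.0
    for p, is_medical in zip(token_probs, medical_flags):
        w = medical_weight if is_medical else 1.0
        total += w * -math.log(p)   # penalize low probability on gold token
        norm += w                    # normalize by total weight
    return total / norm
```

Under this sketch, a mistake on a medical token (low probability) raises the loss more than the same mistake on a non-medical token, which is the incentive the abstract describes.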