Learning to Generate Explanation from e-Hospital Services for Medical Suggestion

Published: 01 Jan 2022, Last Modified: 30 Sept 2024 · COLING 2022 · CC BY-SA 4.0
Abstract: Explaining the reasoning of neural models has attracted attention in recent years. Providing highly accessible and comprehensible explanations in natural language helps humans understand a model's predictions. In this work, we present a pilot study investigating explanation generation with a narrative and causal structure for the scenario of health consulting. Our model generates a medical suggestion addressing the patient's concern and provides an explanation as an outline of the reasoning. To align the generated explanation with the suggestion, we propose a novel discourse-aware mechanism with multi-task learning. Experimental results show that our model achieves promising performance in both quantitative and human evaluation.