RadGraph: Extracting Clinical Entities and Relations from Radiology Reports

Jun 08, 2021 (edited Aug 29, 2021) · NeurIPS 2021 Datasets and Benchmarks Track (Round 1)
  • Keywords: natural language processing, radiology, entity and relation extraction, multi-modal, graph
  • TL;DR: We present RadGraph, a dataset of entities and relations in radiology reports based on a novel information extraction schema.
  • Abstract: Extracting structured clinical information from free-text radiology reports can enable the use of radiology report information for a variety of critical healthcare applications. In our work, we present RadGraph, a dataset of entities and relations in full-text chest X-ray radiology reports based on a novel information extraction schema we designed to structure radiology reports. We release a development dataset, which contains board-certified radiologist annotations for 500 radiology reports from the MIMIC-CXR dataset (14,579 entities and 10,889 relations), and a test dataset, which contains two independent sets of board-certified radiologist annotations for 100 radiology reports split equally across the MIMIC-CXR and CheXpert datasets. Using these datasets, we train and test a deep learning model, RadGraph Benchmark, that achieves a micro F1 of 0.82 and 0.73 on relation extraction on the MIMIC-CXR and CheXpert test sets respectively. Additionally, we release an inference dataset, which contains annotations automatically generated by RadGraph Benchmark across 220,763 MIMIC-CXR reports (around 6 million entities and 4 million relations) and 500 CheXpert reports (13,783 entities and 9,908 relations) with mappings to associated chest radiographs. Our freely available dataset can facilitate a wide range of research in medical natural language processing, as well as computer vision and multi-modal learning when linked to chest radiographs.
  • Supplementary Material: zip
  • URL: https://doi.org/10.13026/hm87-5p47
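The abstract reports micro F1 scores of 0.82 and 0.73 for relation extraction. As a point of reference, here is a minimal sketch of how a micro-averaged F1 over predicted versus gold relation tuples can be computed; the `(head, relation, tail)` tuple format and relation names below are illustrative assumptions, not the official RadGraph evaluation code.

```python
def micro_f1(gold, pred):
    """Micro F1 between gold and predicted sets of (head, relation, tail) tuples.

    In the micro average, true positives, false positives, and false
    negatives are pooled across all reports before computing P/R/F1.
    """
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)  # relations predicted exactly as annotated
    if not gold and not pred:
        return 1.0
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


# Illustrative example (hypothetical annotations, not from the dataset):
gold = {("opacity", "located_at", "lung"), ("effusion", "suggestive_of", "edema")}
pred = {("opacity", "located_at", "lung")}
print(round(micro_f1(gold, pred), 3))  # precision 1.0, recall 0.5 -> F1 0.667
```

Pooling counts across all examples (micro) rather than averaging per-report scores (macro) weights frequent relation types more heavily, which is the convention reported here.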