Comparing radiologists' gaze and saliency maps generated by interpretability methods for chest x-rays

Published: 20 Oct 2022 · Last Modified: 10 Nov 2024 · Gaze Meets ML 2022 Poster
Keywords: Chest X-rays, Radiology, Eye Tracking, Gaze, Saliency Maps
TL;DR: We compare gaze data collected from radiologists with heatmaps generated by Grad-CAM and spatial attention maps, showing that a combination of both methods provides the best results.
Abstract: We use a dataset of eye-tracking data from five radiologists to compare the image regions that deep learning models use for their decisions with heatmaps of where radiologists looked. We conduct a class-independent analysis of the saliency maps generated by two methods selected from the literature: Grad-CAM and attention maps from an attention-gated model. For the comparison, we use shuffled metrics, which avoid biases arising from fixation locations. In one metric we achieve scores comparable to an interobserver baseline, highlighting the potential of Grad-CAM saliency maps to mimic a radiologist's attention over an image. We also divide the dataset into subsets to evaluate in which cases the similarities are higher.
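The shuffled metrics mentioned in the abstract (e.g., shuffled AUC) score a saliency map by treating its values at the current image's fixation points as positives and its values at fixation points sampled from *other* images as negatives, which discounts location biases such as the center bias shared across images. A minimal sketch of this idea follows; the function name and the toy data are illustrative and not taken from the paper.

```python
import numpy as np

def shuffled_auc(saliency, fixations, other_fixations):
    """Shuffled AUC: saliency values at this image's fixations are
    positives; values at fixations drawn from other images are
    negatives. Returns the rank-based (Mann-Whitney) AUC with
    average ranks for ties."""
    pos = saliency[fixations[:, 0], fixations[:, 1]]
    neg = saliency[other_fixations[:, 0], other_fixations[:, 1]]
    scores = np.concatenate([pos, neg])
    labels = np.concatenate([np.ones(len(pos)), np.zeros(len(neg))])
    # Assign ranks 1..N by sorted score, then average ranks over ties
    order = scores.argsort()
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    for v in np.unique(scores):
        tie = scores == v
        ranks[tie] = ranks[tie].mean()
    n_pos, n_neg = len(pos), len(neg)
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy example: a saliency map peaked exactly where fixations cluster
rng = np.random.default_rng(0)
sal = np.zeros((64, 64))
sal[20:30, 20:30] = 1.0
fix = rng.integers(20, 30, size=(50, 2))     # fixations on the salient patch
other = rng.integers(0, 64, size=(200, 2))   # fixations pooled from other images
print(shuffled_auc(sal, fix, other))         # close to 1 for a good match
```

A uniform saliency map scores exactly 0.5 under this metric regardless of where the fixations fall, which is the debiasing property that motivates shuffled metrics over plain AUC.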
Submission Type: Extended Abstract
Travel Award - Academic Status: Ph.D. Student
Travel Award - Institution And Country: University of Utah, USA
Travel Award - Low To Lower-middle Income Countries: No, my institution does not qualify.
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/comparing-radiologists-gaze-and-saliency-maps/code)