A Multi-scale Graph Network with Multi-head Attention for Histopathology Image Diagnosis

Published: 25 Aug 2021, Last Modified: 05 May 2023, COMPAY 2021
Keywords: Whole slide image classification, Graph convolution, Attention.
Abstract: Hematoxylin-eosin (H&E) staining plays an essential role in brain glioma diagnosis, but reading pathologic images and generating diagnostic reports is tedious and laborious work. Pathologists need to combine and navigate extremely large images at different scales and to quantify different aspects for subtyping. In this work, we propose an automatic diagnosis algorithm that identifies cell types and severity in H&E slides in order to classify five major subtypes of glioma from whole slide pathological images. The proposed method features a pyramid graph structure and an attention-based multi-instance learning strategy. Our method not only improves classification accuracy by utilizing multi-scale information, but also helps to identify high-risk patches. We summarize patches from multiple resolutions into a graph structure: the nodes of the pyramid graph are feature vectors extracted from image patches, and these nodes are connected according to their spatial adjacency. We then feed the graph into the proposed model with self-attention and graph convolutions. Here, we use a multi-head self-attention architecture in which identical self-attention blocks are stacked in parallel. As demonstrated in Transformer networks, multiple attention maps capture comprehensive activation patterns from different subspace representations. Using the proposed method, the results show 71% accuracy for glioma subtyping. The multi-resolution attention maps generated by the proposed method could help locate proliferations and necrosis in the whole pathologic slide.
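The abstract does not give implementation details, but the two core operations it names — multi-head self-attention producing per-head attention maps, and graph convolution over a pyramid adjacency of patch features — can be illustrated with a minimal NumPy sketch. All function names, dimensions, and the toy two-scale connectivity below are illustrative assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads=4):
    """Identical self-attention blocks stacked in parallel: each head
    projects node features into its own subspace, computes scaled
    dot-product attention, and the head outputs are concatenated."""
    n, d = X.shape
    dh = d // num_heads
    heads, maps = [], []
    for _ in range(num_heads):
        Wq = rng.standard_normal((d, dh)) / np.sqrt(d)
        Wk = rng.standard_normal((d, dh)) / np.sqrt(d)
        Wv = rng.standard_normal((d, dh)) / np.sqrt(d)
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        A = softmax(Q @ K.T / np.sqrt(dh))  # (n, n) attention map per head
        heads.append(A @ V)
        maps.append(A)
    return np.concatenate(heads, axis=1), maps

def graph_conv(X, adj):
    """One symmetric-normalized graph convolution step over the
    spatial-adjacency graph of patches (GCN-style propagation)."""
    A = adj + np.eye(adj.shape[0])          # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    W = rng.standard_normal((X.shape[1], X.shape[1])) / np.sqrt(X.shape[1])
    return np.maximum(D_inv_sqrt @ A @ D_inv_sqrt @ X @ W, 0)  # ReLU

# Toy pyramid: 4 low-resolution patches, each covering 4 high-resolution
# children; edges link each parent to its children across scales.
n_low, n_high, d = 4, 16, 32
X = rng.standard_normal((n_low + n_high, d))  # node feature vectors
adj = np.zeros((n_low + n_high, n_low + n_high))
for p in range(n_low):
    for c in range(4):
        child = n_low + p * 4 + c
        adj[p, child] = adj[child, p] = 1

H, attn_maps = multi_head_attention(X)  # per-head maps highlight patches
out = graph_conv(H, adj)                # propagate across the pyramid
```

In a full pipeline, the per-head attention maps (`attn_maps`) are the quantities the abstract refers to when it says the attention maps could help locate proliferations and necrosis: each row scores how strongly one patch attends to every other patch across the pyramid.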