GBT: Geometric-oriented Brain Transformer for Autism Diagnosis

Published: 22 Oct 2024, Last Modified: 22 Feb 2026 · Medical Image Computing and Computer Assisted Intervention – MICCAI 2024 · CC BY 4.0
Abstract: Human brains are typically modeled as networks of Regions of Interest (ROIs) to comprehend the brain functional Magnetic Resonance Imaging (fMRI) connectome for Autism diagnosis. Recently, various deep neural network-based models have been developed to learn representations of ROIs, achieving impressive performance improvements. However, they (i) heavily rely on increasingly complex network architectures with an obscure learning mechanism, or (ii) solely utilize the cross-entropy loss to supervise the training process, leading to sub-optimal performance. To this end, we propose a simple and effective Geometric-oriented Brain Transformer (GBT) with an Attention Weight Matrix Approximation (AWMA)-based transformer module and a geometric-oriented representation learning module for brain fMRI connectome analysis. Specifically, the AWMA-based transformer module selectively removes the components of the attention weight matrix with smaller singular values, aiming to learn the most relevant and representative graph representation. The geometric-oriented representation learning module imposes low-rank intra-class compactness and high-rank inter-class diversity constraints on the learned representations to promote them to be discriminative. Experimental results on the ABIDE dataset validate that our method GBT consistently outperforms state-of-the-art approaches. The code is available at https://github.com/CUHK-AIM-Group/GBT.
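The AWMA idea described in the abstract, removing the components of the attention weight matrix associated with smaller singular values, amounts to a truncated-SVD low-rank approximation. The sketch below illustrates that operation only; the function name, the rank parameter `k`, and the use of plain NumPy are illustrative assumptions, not details taken from the paper or its released code.

```python
import numpy as np

def low_rank_attention(attn: np.ndarray, k: int) -> np.ndarray:
    """Approximate an (N x N) attention weight matrix by keeping only
    its k largest singular components, discarding the smaller ones
    (a truncated SVD, as suggested by the AWMA description)."""
    U, S, Vt = np.linalg.svd(attn, full_matrices=False)
    # Reconstruct from the top-k singular triplets: U_k diag(S_k) Vt_k
    return (U[:, :k] * S[:k]) @ Vt[:k, :]

# Toy usage: a row-stochastic matrix standing in for softmax attention.
rng = np.random.default_rng(0)
A = rng.random((8, 8))
A = A / A.sum(axis=1, keepdims=True)
A_k = low_rank_attention(A, 3)  # rank of A_k is at most 3
```

Keeping only the dominant singular directions acts as a spectral filter on the attention map, which matches the abstract's stated goal of retaining the most relevant and representative structure.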