Large-scale Graph Representation Learning of Dynamic Brain Connectome with Transformers

Published: 20 Oct 2023, Last Modified: 13 Dec 2023 · TGL Workshop 2023 Short Paper
Keywords: graph transformer, functional connectivity, temporal graph learning, fmri, neuroimaging
TL;DR: We propose a Graph Transformer based method to learn dynamic functional connectivity which shows state-of-the-art performance on large-scale fMRI benchmarks.
Abstract: Graph Transformers have recently been successful in various graph representation learning tasks, offering a number of advantages over message-passing Graph Neural Networks. Utilizing Graph Transformers to learn representations of the brain's functional connectivity network is also gaining interest. However, studies to date have overlooked the temporal dynamics of functional connectivity, which fluctuates over time. Here, we propose a method for learning representations of dynamic functional connectivity with Graph Transformers. Specifically, we define the connectome embedding, which holds the position, structure, and time information of the functional connectivity graph, and use Transformers to learn its representation across time. We perform experiments with over 50,000 resting-state fMRI samples obtained from three datasets, the largest amount of fMRI data used in such studies to date. The experimental results show that our proposed method outperforms competitive baselines on gender classification and age regression tasks based on the functional connectivity extracted from the fMRI data.
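The paper's exact construction is given in the PDF; as a rough illustration only, the sketch below shows one plausible way a "connectome embedding" could combine position, structure, and time information for each brain region. The one-hot position feature, the connection-profile structure feature, and the sinusoidal time encoding are all our assumptions for this sketch, not the authors' actual method.

```python
import numpy as np

def connectome_embedding(conn, t, d_time=8):
    """Hypothetical sketch of a connectome embedding for one time window.

    conn   : (N, N) functional connectivity matrix at window index t
    t      : integer index of the sliding window (time information)
    d_time : dimensionality of the sinusoidal time encoding (assumed)
    """
    N = conn.shape[0]
    pos = np.eye(N)                # position: one-hot node (ROI) identity
    struct = conn                  # structure: each node's connection profile
    # time: sinusoidal encoding of the window index, shared by all nodes
    i = np.arange(d_time // 2)
    angles = t / (10000.0 ** (2 * i / d_time))
    time_enc = np.concatenate([np.sin(angles), np.cos(angles)])
    time_feat = np.tile(time_enc, (N, 1))
    # one token per node, ready to feed a Transformer across windows
    return np.concatenate([pos, struct, time_feat], axis=1)

# Toy dynamic FC: 4 ROIs, correlation over 50 timepoints, window index 3
rng = np.random.default_rng(0)
ts = rng.standard_normal((4, 50))
conn = np.corrcoef(ts)
emb = connectome_embedding(conn, t=3)
print(emb.shape)  # (4, 2*4 + 8) = (4, 16)
```

In a pipeline like the one the abstract describes, such per-window token matrices would then be passed through a Transformer to model how connectivity fluctuates across windows.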
Supplementary Material: pdf
Format: Long paper, up to 8 pages. If the reviewers recommend changing it to a short paper, I am willing to revise it to fit within 4 pages.
Submission Number: 51