STF: A spherical transformer for versatile cortical surfaces applications

Jiale Cheng, Fenqiang Zhao, Zhengwang Wu, Xinrui Yuan, Li Wang, John H Gilmore, Weili Lin, Xin Zhang, Gang Li

Published: 01 Sept 2025 · Last Modified: 19 Oct 2025 · NeuroImage · CC BY-SA 4.0
Abstract:

Highlights
• We present a novel approach that applies global and local self-attention mechanisms on the spherical manifold, enabling robust and effective learning on cortical surface data.
• The Spherical Transformer extends seamlessly to longitudinal data with spatiotemporal self-attention, highlighting its versatility across different scenarios.
• Our model demonstrates consistent, high-quality performance across surface-level, vertex-level, and longitudinal prediction tasks, outperforming state-of-the-art methods.
• The Spherical Transformer provides a reliable benchmark in cortical surface data analysis, offering a robust foundation for further developments in the field.
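To illustrate the idea behind local self-attention on spherical cortical surface data, the following is a minimal sketch, not the authors' implementation: it groups per-vertex features of an icosahedrally sampled sphere into local patches and applies standard multi-head self-attention within each patch. The class name, the patch_size parameter, and the vertex grouping scheme are assumptions made purely for illustration.

import torch
import torch.nn as nn

class LocalSphericalSelfAttention(nn.Module):
    """Illustrative local self-attention over patches of spherical vertices (hypothetical layer)."""
    def __init__(self, dim, num_heads=4, patch_size=64):
        super().__init__()
        self.patch_size = patch_size
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x):
        # x: (batch, num_vertices, dim); here num_vertices must divide evenly into patches
        b, n, d = x.shape
        p = self.patch_size
        patches = x.reshape(b * n // p, p, d)   # group vertices into local patches
        out, _ = self.attn(patches, patches, patches)  # attention restricted to each patch
        return out.reshape(b, n, d)

# Toy usage: a level-5 icosahedral sphere has 10,242 vertices, which does not divide
# evenly by 64, so this example uses 10,240 vertices purely for shape illustration.
feats = torch.randn(2, 10240, 32)
layer = LocalSphericalSelfAttention(dim=32, num_heads=4, patch_size=64)
print(layer(feats).shape)  # torch.Size([2, 10240, 32])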