Highlights

• We present a novel approach that applies global and local self-attention mechanisms on the spherical manifold, enabling robust and effective learning from cortical surface data.
• The Spherical Transformer extends seamlessly to longitudinal data via spatiotemporal self-attention, highlighting its versatility across different scenarios.
• Our model demonstrates consistent, high-quality performance across surface-level, vertex-level, and longitudinal prediction tasks, outperforming state-of-the-art methods.
• The Spherical Transformer provides a reliable benchmark in cortical surface data analysis, offering a robust foundation for further developments in the field.
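To make the first highlight concrete, the sketch below shows one plausible way to combine local (within-patch) and global (across-patch) self-attention over vertices of a spherical mesh. This is a minimal illustration, not the paper's implementation: the module name `SphericalAttentionBlock`, the patch grouping (e.g., from icosphere subdivision), and all dimensions are hypothetical placeholders.

```python
# Minimal sketch (assumed, not the authors' code): local self-attention among
# vertices inside each spherical patch, followed by global self-attention
# among mean-pooled patch summaries whose context is broadcast back.
import torch
import torch.nn as nn

class SphericalAttentionBlock(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.local_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.global_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_patches, patch_size, dim) -- vertex features grouped
        # into local patches of the spherical mesh (grouping is hypothetical).
        b, p, s, d = x.shape

        # Local self-attention: vertices attend within their own patch.
        local = self.norm1(x.reshape(b * p, s, d))
        local, _ = self.local_attn(local, local, local)
        x = x + local.reshape(b, p, s, d)

        # Global self-attention: patch summaries attend across the sphere.
        tokens = self.norm2(x.mean(dim=2))            # (b, p, d)
        ctx, _ = self.global_attn(tokens, tokens, tokens)
        return x + ctx.unsqueeze(2)                    # broadcast context to vertices

# Toy usage: 2 samples, 20 patches of 45 vertices, 32-dim features.
block = SphericalAttentionBlock(dim=32)
out = block(torch.randn(2, 20, 45, 32))
print(out.shape)  # torch.Size([2, 20, 45, 32])
```

The two-stage design mirrors the highlight's claim: local attention captures fine-grained cortical geometry, while global attention propagates information across the whole spherical surface.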
DOI: 10.1016/j.neuroimage.2025.121370