Abstract: Dynamic functional brain network analysis using rs-fMRI has emerged as a powerful approach to understanding brain disorders. However, current methods predominantly focus on pairwise brain region interactions, neglecting critical high-order dependencies and time-varying communication mechanisms. To address these limitations, we propose the Long-Range High-Order Dependency Transformer (LHDFormer), a neurophysiologically inspired framework that integrates multiscale long-range dependencies with time-varying connectivity patterns. Specifically, we present a biased random walk sampling strategy with NeuroWalk kernel-guided transfer probabilities that dynamically simulate multi-step information loss through a k-walk neuroadaptive factor, modeling neurobiological principles of the brain such as distance-dependent information loss and state-dependent pathway modulation. This enables adaptive capture of multi-scale short-range couplings and long-range high-order dependencies corresponding to different walk steps across evolving connectivity patterns. Complementing this, a time-varying transformer co-embeds local spatial configurations via topology-aware attention and global temporal dynamics through cross-window token guidance, overcoming the single-domain bias of conventional graph/transformer methods. Extensive experiments on the ABIDE and ADNI datasets demonstrate that LHDFormer outperforms state-of-the-art methods in brain disease diagnosis. Crucially, the model identifies interpretable high-order connectivity signatures, revealing disrupted long-range integration patterns in patients that align with known neuropathological mechanisms.
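The abstract does not give the exact form of the NeuroWalk kernel or the k-walk neuroadaptive factor. As a minimal illustrative sketch (not the authors' implementation), the following Python snippet shows one plausible reading: a biased random walk over a functional connectivity matrix in which a hypothetical decay term exp(-alpha * k) makes the walk increasingly likely to terminate at longer distances, mimicking distance-dependent information loss. The names `neuro_walk`, `conn`, and `alpha` are assumptions introduced here for illustration.

```python
import numpy as np

def neuro_walk(conn, start, max_steps, alpha=0.3, rng=None):
    """Biased random walk with distance-dependent information loss (illustrative).

    conn      : (N, N) non-negative functional connectivity matrix
    start     : index of the seed brain region
    max_steps : maximum walk length k_max
    alpha     : hypothetical decay rate of the k-walk factor exp(-alpha * k)
    """
    rng = rng or np.random.default_rng()
    walk = [start]
    current = start
    for k in range(1, max_steps + 1):
        # Multi-step information loss: the walk survives step k with
        # probability exp(-alpha * k), so long-range hops become rarer.
        if rng.random() > np.exp(-alpha * k):
            break
        weights = conn[current].copy()
        weights[current] = 0.0                      # disallow self-transitions
        if weights.sum() == 0.0:                    # isolated region: stop the walk
            break
        probs = weights / weights.sum()             # kernel-guided transfer probabilities
        current = int(rng.choice(len(probs), p=probs))
        walk.append(current)
    return walk


# Example: sample a walk from a random symmetric connectivity matrix
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 90                                          # e.g. a 90-region atlas (assumed)
    conn = np.abs(rng.standard_normal((N, N)))
    conn = (conn + conn.T) / 2                      # symmetrize
    print(neuro_walk(conn, start=0, max_steps=8, rng=rng))
```

Short walks from such a sampler would capture short-range couplings, while the rarer long walks would correspond to the long-range high-order dependencies described in the abstract; the paper's actual kernel and transfer probabilities may differ.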
External IDs: dblp:conf/miccai/XueHHZDG25