MambaMoE: Mixture-of-spectral-spatial-experts state space model for hyperspectral image classification

Yichu Xu, Di Wang, Hongzan Jiao, Lefei Zhang, Liangpei Zhang

Published: 01 Mar 2026, Last Modified: 21 Jan 2026 · Information Fusion · CC BY-SA 4.0
Highlights:

- This paper introduces MambaMoE, a novel Mamba-based spectral-spatial mixture-of-experts framework for HSI classification. To the best of our knowledge, this is the first MoE-based deep network introduced in the HSI classification domain, enabling adaptive extraction of spectral-spatial joint features tailored to the diverse characteristics of land covers.
- We design the Mixture of Mamba Expert Block (MoMEB), which integrates spatially routed experts and spectrally shared experts based on Mamba. Leveraging sparse expert activation, MoMEB facilitates dynamic learning of directional spectral-spatial features.
- We introduce an uncertainty-guided corrective learning (UGCL) strategy to guide the model's focus toward challenging regions. This approach mitigates prediction confusion arising from the directional modeling limitations of Mamba by adaptively refining feature representations in uncertain areas.
- Extensive experiments on multiple HSI classification benchmarks demonstrate that our proposed method consistently outperforms existing state-of-the-art approaches in both accuracy and computational efficiency, thanks to the synergy of our architecture and training strategy.
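The sparse expert activation that MoMEB relies on is the standard top-k routing used in mixture-of-experts layers: a gating network scores all experts per token, only the k highest-scoring experts run, and their outputs are combined with renormalized gate weights. The paper's actual implementation is not shown here; the sketch below is a generic numpy illustration of that routing mechanism, with all names (`topk_moe`, plain linear experts) purely illustrative.

```python
import numpy as np

def topk_moe(tokens, expert_weights, gate_weights, k=2):
    """Illustrative sparse MoE routing: each token is processed only by
    its top-k experts (here, simple linear maps), mimicking the sparse
    expert activation described for MoMEB.

    tokens:         (n, d) token features
    expert_weights: list of (d, d) matrices, one per expert
    gate_weights:   (d, num_experts) gating projection
    """
    n, d = tokens.shape
    logits = tokens @ gate_weights                       # (n, num_experts)
    # Softmax over experts (numerically stabilized).
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    # Keep only the top-k gates per token; renormalize so they sum to 1.
    topk = np.argsort(-probs, axis=1)[:, :k]
    out = np.zeros_like(tokens)
    for i in range(n):
        gates = probs[i, topk[i]]
        gates = gates / gates.sum()
        for g, e in zip(gates, topk[i]):
            out[i] += g * (tokens[i] @ expert_weights[e])
    return out
```

In MoMEB the experts would be directional Mamba scans rather than linear maps, but the routing logic (score, select top-k, weight, sum) is the same; sparsity keeps the per-token compute close to that of k experts regardless of the total expert count.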