Mamba-GINR: A Scalable Framework for Spatiotemporal Representation of fMRI

Published: 23 Sept 2025, Last Modified: 06 Dec 2025
Venue: DBM 2025 Findings Poster
License: CC BY 4.0
TL;DR: This work introduces Mamba-based generalizable implicit neural representations (GINRs) for functional MRI representation learning and reconstruction.
Abstract: Generalizable implicit neural representations (GINRs) are a powerful paradigm for modeling large-scale functional MRI (fMRI) data, but their adoption is blocked by a key modeling challenge: prior GINRs built on Transformers cannot scale to 4D fMRI because of the quadratic complexity of attention, which prevents promising applications such as data compression, temporal interpolation, and representation learning for large-scale scientific data. This work introduces Mamba-GINR, a framework that uses Mamba as a backbone to achieve linear-time scaling. Our results, benchmarked on standard image datasets (CIFAR-10, CelebA), show that it achieves superior reconstruction quality. Critically, we demonstrate Mamba's superior scalability within the GINR framework: it significantly outperforms baselines given an identical token budget and is the only GINR variant that can successfully model sequences at a scale comparable to 4D fMRI data. Further analysis of the placement of learnable queries and the model's internal time delta ($\Delta$) parameter confirms its ability to create robust, high-fidelity representations. By addressing this critical modeling bottleneck, our work has the potential to make GINRs a more viable tool for fMRI analysis. This advance in scalability could enable the continuous representation of entire fMRI sessions, preserving rich temporal dynamics that are often lost to computational constraints. We present this framework as a foundational tool, and we invite the neuroscience community to collaborate on applying it to explore complex, long-timescale brain activity in large-scale datasets.
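The two mechanisms the abstract highlights can be sketched in a few lines. The following is a minimal, hypothetical NumPy illustration, not the authors' implementation: `interleave_queries` shows one possible placement of learnable query tokens within the input sequence (the placement the paper ablates), and `selective_scan` shows a toy diagonal state-space recurrence with an input-dependent time step $\Delta$, whose cost grows linearly with sequence length. All names, dimensions, and parameter choices here are illustrative assumptions.

```python
# Hypothetical sketch -- not the authors' code. Illustrates (1) interleaving
# learnable query tokens into the token stream and (2) a toy selective scan
# with a per-step time delta, running in O(sequence length).
import numpy as np

rng = np.random.default_rng(0)

def interleave_queries(tokens, queries, stride):
    """Insert one learnable query token after every `stride` input tokens.
    The placement strategy is a design choice (the paper ablates it)."""
    out, q = [], 0
    for i, tok in enumerate(tokens):
        out.append(tok)
        if (i + 1) % stride == 0 and q < len(queries):
            out.append(queries[q])
            q += 1
    return np.stack(out)

def selective_scan(x, A, B, C, delta):
    """Toy diagonal SSM recurrence:
        h_t = exp(delta_t * A) * h_{t-1} + delta_t * (B @ x_t)
        y_t = C @ h_t
    One pass over the sequence -> linear-time, unlike quadratic attention."""
    h = np.zeros(A.shape[0])
    out = np.zeros_like(x)
    for t in range(len(x)):
        h = np.exp(delta[t] * A) * h + delta[t] * (B @ x[t])
        out[t] = C @ h
    return out

# Illustrative dimensions (assumptions, not the paper's configuration).
d_model, d_state = 4, 3
tokens = rng.normal(size=(8, d_model))     # e.g. patch/voxel tokens
queries = rng.normal(size=(2, d_model))    # learnable query tokens
seq = interleave_queries(tokens, queries, stride=4)

A = -np.abs(rng.normal(size=d_state))      # negative -> stable decay
B = rng.normal(size=(d_state, d_model))
C = rng.normal(size=(d_model, d_state))
delta = np.abs(rng.normal(size=len(seq)))  # input-dependent step in real Mamba

y = selective_scan(seq, A, B, C, delta)
```

Because the recurrence touches each token once, doubling the sequence length only doubles the cost, which is the scaling property that lets the framework handle sequences at 4D-fMRI scale.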
Length: long paper (up to 8 pages)
Domain: methods
Author List Check: The author list is correctly ordered and I understand that additions and removals will not be allowed after the abstract submission deadline.
Anonymization Check: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and URLs that point to identifying information.
Submission Number: 32