LEA: Learning Latent Embedding Alignment Model for fMRI Decoding and Encoding

Published: 18 Dec 2024, Last Modified: 18 Dec 2024. Accepted by TMLR. License: CC BY 4.0.
Abstract: The connection between brain activity and visual stimuli is crucial to understanding the human brain. Although deep generative models have shown advances in recovering brain recordings by generating images conditioned on fMRI signals, it remains challenging to generate semantically consistent images. Moreover, predicting fMRI signals from visual stimuli is still a hard problem. In this paper, we introduce a unified framework that addresses both fMRI decoding and encoding. We train two latent spaces to represent and reconstruct fMRI signals and visual images, respectively. By aligning these two latent spaces, we can seamlessly transform between fMRI signals and visual stimuli. Our model, called Latent Embedding Alignment (LEA), can recover visual stimuli from fMRI signals and predict brain activity from images. LEA outperforms existing methods on multiple fMRI decoding and encoding benchmarks, offering a comprehensive solution for modeling the relationship between fMRI signals and visual stimuli. The code is available at \url{https://github.com/naiq/LEA}.
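The core idea in the abstract (two modality-specific latent spaces bridged by an alignment map) can be illustrated with a toy sketch. This is not the authors' model: LEA uses learned autoencoders and transformer-based alignment, whereas the sketch below substitutes SVD-based projections and a least-squares linear map on synthetic paired data, purely to show the two-latent-space-plus-alignment structure. All names and dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired data: "fMRI" vectors and "image" vectors of different
# dimensionality that share a common underlying latent structure.
n, d_fmri, d_img, d_lat = 200, 50, 64, 8
z_true = rng.normal(size=(n, d_lat))                     # shared latent factors
x_fmri = z_true @ rng.normal(size=(d_lat, d_fmri))       # fMRI modality
x_img = z_true @ rng.normal(size=(d_lat, d_img))         # image modality

def fit_encoder(x, d):
    """Stand-in encoder: project onto the top-d right singular vectors
    (LEA itself trains autoencoders per modality)."""
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return vt[:d].T  # (features, d) projection matrix

# Two separate latent spaces, one per modality.
w_fmri = fit_encoder(x_fmri, d_lat)
w_img = fit_encoder(x_img, d_lat)
z_fmri = x_fmri @ w_fmri
z_img = x_img @ w_img

# Alignment step: least-squares map from the fMRI latent space to the image
# latent space; applying it performs "decoding" (fMRI -> image latent).
a, *_ = np.linalg.lstsq(z_fmri, z_img, rcond=None)
z_pred = z_fmri @ a

err = np.linalg.norm(z_pred - z_img) / np.linalg.norm(z_img)
print(f"relative alignment error: {err:.3f}")
```

Because both toy latents are linear functions of the same underlying factors, the linear alignment recovers the image latents almost exactly; the encoding direction would simply fit the inverse map from `z_img` to `z_fmri`.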
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Dear Action Editor, thank you for your effort and attention to our submission. We now submit our final version, which incorporates the valuable comments from you and all reviewers. Briefly:
- We have proofread the paper carefully and fixed grammatical and clerical errors. In addition, we have polished the presentation in the abstract, Sec. 3.1, 3.4, and 6, and added descriptions of the fMRI preprocessing procedures in Sec. A.2.
- We have included comparison results with (Vodrahalli et al., 2018; Ozcelik et al., 2022) in Tab. 3.
- We have provided more discussion and explanation of the study in Fig. 6.
- We have cited all relevant papers as suggested by the reviewers.
- We have released our code at [https://github.com/naiq/LEA](https://github.com/naiq/LEA).
Code: https://github.com/naiq/LEA
Assigned Action Editor: ~Bertrand_Thirion1
Submission Number: 2365