Joint Supervised and Self-supervised Learning for MRI Reconstruction

Published: 27 Mar 2025 · Last Modified: 19 May 2025 · MIDL 2025 Poster · CC BY 4.0
Keywords: MRI Reconstruction, Inverse Problems, Deep Learning, Self-supervised Learning
TL;DR: JSSL combines supervised learning on proxy datasets with self-supervised learning on target datasets to improve MRI reconstruction when fully-sampled target data is unavailable.
Abstract: Magnetic Resonance Imaging (MRI) is a crucial imaging modality, but its inherently slow acquisition process poses challenges in obtaining fully-sampled $k$-space data, particularly under motion. The lack of fully-sampled acquisitions, which serve as ground truths, complicates the training of deep learning (DL) algorithms in a supervised manner. To address this limitation, self-supervised learning (SSL) methods have emerged as a viable alternative, leveraging the available subsampled $k$-space data to train neural networks for MRI reconstruction. Nevertheless, these approaches often fall short when compared to supervised learning (SL). We propose Joint Supervised and Self-supervised Learning (JSSL), a novel training approach for DL-based MRI reconstruction algorithms aimed at enhancing reconstruction quality in cases where target datasets containing fully-sampled $k$-space measurements are unavailable. JSSL operates by simultaneously training a model in an SSL setting, using subsampled data from the target dataset(s), and in an SL manner, utilizing proxy datasets with fully-sampled $k$-space data. We demonstrate JSSL's efficacy using two distinct combinations of target and proxy data. Quantitative and qualitative results showcase substantial improvements over conventional SSL methods. Furthermore, we provide "rule-of-thumb" guidelines for training MRI reconstruction models. Our code is available at https://github.com/NKI-AI/direct.
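The joint training objective described in the abstract can be sketched as a weighted sum of a supervised term on proxy batches (which have fully-sampled ground truths) and a self-supervised term on subsampled target batches. The sketch below is a minimal illustration under stated assumptions: the function names, the `alpha` weighting, and the SSDU-style held-out-mask self-supervised loss are all hypothetical simplifications, not the paper's exact formulation.

```python
import numpy as np

def supervised_loss(pred_img, gt_img):
    # SL term: compare the reconstruction against the fully-sampled
    # ground-truth image from a proxy dataset (mean absolute error here).
    return np.mean(np.abs(pred_img - gt_img))

def self_supervised_loss(pred_kspace, measured_kspace, loss_mask):
    # SSL term (SSDU-style assumption): compare predicted k-space against
    # the measured subsampled k-space only on a held-out subset of the
    # acquired samples, selected by loss_mask.
    return np.mean(np.abs((pred_kspace - measured_kspace) * loss_mask))

def jssl_loss(sl_batch, ssl_batch, alpha=0.5):
    # Joint objective: weighted combination of the SL term (proxy data)
    # and the SSL term (target data). alpha is a hypothetical weight.
    sl = supervised_loss(sl_batch["pred"], sl_batch["gt"])
    ssl = self_supervised_loss(
        ssl_batch["pred_k"], ssl_batch["meas_k"], ssl_batch["mask"]
    )
    return alpha * sl + (1.0 - alpha) * ssl
```

In practice both terms would be computed per training step from separate proxy and target mini-batches, so a single network receives gradients from both supervision signals simultaneously.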
Primary Subject Area: Image Acquisition and Reconstruction
Secondary Subject Area: Transfer Learning and Domain Adaptation
Paper Type: Methodological Development
Registration Requirement: Yes
Reproducibility: https://github.com/NKI-AI/direct
Submission Number: 64