MedFuncta: A Unified Framework for Learning Efficient Medical Neural Fields

Published: 14 Feb 2026, Last Modified: 05 Mar 2026, MIDL 2026 Poster, CC BY 4.0
Keywords: Generalizable Neural Fields, Implicit Neural Representations, Meta-Learning
TL;DR: MedFuncta efficiently learns continuous neural representations of diverse medical signals using meta-learned shared networks conditioned by compact latent vectors.
Abstract: Research in medical imaging primarily focuses on discrete data representations that scale poorly with grid resolution and fail to capture the often continuous nature of the underlying signal. Neural Fields (NFs) offer a powerful alternative by modeling data as continuous functions. While single-instance NFs have been successfully applied in medical contexts, extending them to large-scale medical datasets remains an open challenge. We therefore introduce **MedFuncta**, a unified framework for large-scale NF training on diverse medical signals. Building on Functa, our approach encodes data into a unified representation, namely a 1D latent vector, that modulates a shared, meta-learned NF, enabling generalization across a dataset. We revisit common design choices, introducing a non-constant frequency parameter $\omega$ in widely used SIREN activations, and establish a connection between this $\omega$-schedule and layer-wise learning rates, relating our findings to recent work on theoretical learning dynamics. We additionally introduce a scalable meta-learning strategy for shared network learning that employs sparse supervision during training, thereby reducing memory consumption and computational overhead while maintaining competitive performance. Finally, we evaluate MedFuncta across a diverse range of medical datasets and show how to solve relevant downstream tasks on our neural data representation. To promote further research in this direction, we release our code, model weights, and the first large-scale dataset, **MedNF**, containing more than 500k latent vectors for multi-instance medical NFs. The project page is available at: https://pfriedri.github.io/medfuncta-io.
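The core ideas in the abstract (a shared SIREN-style network with a non-constant per-layer frequency $\omega$, conditioned on a per-instance 1D latent vector via shift modulations, as in Functa) can be sketched as follows. This is a minimal illustrative sketch only: the layer sizes, the $\omega$ values, and the linear latent-to-shift mapping are assumptions, not the paper's actual configuration.

```python
import numpy as np

# Sketch: SIREN-style network with a per-layer omega schedule and
# Functa-style shift modulations from a 1D latent vector.
# All sizes, omega values, and the latent-to-shift mapping are
# illustrative assumptions, not MedFuncta's exact setup.

rng = np.random.default_rng(0)

def siren_layer(x, W, b, omega, shift):
    # sin(omega * (W x + b + shift)); the shift conditions the
    # shared network on a per-instance latent code.
    return np.sin(omega * (x @ W.T + b + shift))

dim_in, hidden, dim_out, latent_dim = 2, 16, 1, 8
omegas = [30.0, 1.0]  # non-constant omega schedule (assumed values)

# Shared (meta-learned) parameters of the base network.
W1 = rng.normal(0.0, 1.0 / dim_in, (hidden, dim_in))
b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 1.0 / hidden, (hidden, hidden))
b2 = np.zeros(hidden)
W_out = rng.normal(0.0, 1.0 / hidden, (dim_out, hidden))
b_out = np.zeros(dim_out)

# Per-instance 1D latent vector, mapped linearly to layer-wise shifts
# (hypothetical mapping; only the latent would be fit per instance).
latent = rng.normal(size=latent_dim)
M1 = rng.normal(0.0, 0.01, (hidden, latent_dim))
M2 = rng.normal(0.0, 0.01, (hidden, latent_dim))

def field(coords):
    # Map continuous coordinates -> signal value (e.g. intensity).
    h = siren_layer(coords, W1, b1, omegas[0], M1 @ latent)
    h = siren_layer(h, W2, b2, omegas[1], M2 @ latent)
    return h @ W_out.T + b_out  # linear output layer

# Query the field on a small coordinate grid in [-1, 1]^2.
coords = np.stack(
    np.meshgrid(np.linspace(-1, 1, 4), np.linspace(-1, 1, 4)), axis=-1
).reshape(-1, 2)
print(field(coords).shape)
```

Because the signal is a function of continuous coordinates, the same network can be queried at any resolution; in the meta-learning setting, only the latent vector (and hence the shifts) would be adapted per instance while the remaining weights stay shared.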
Primary Subject Area: Unsupervised Learning and Representation Learning
Secondary Subject Area: Application: Other
Registration Requirement: Yes
Reproducibility: https://github.com/pfriedri/medfuncta/
Visa & Travel: No
Read CFP & Author Instructions: Yes
Originality Policy: Yes
Single-blind & Not Under Review Elsewhere: Yes
LLM Policy: Yes
Midl Latex Submission Checklist:
- Ensure no LaTeX errors during compilation.
- Created a single midl26_NNN.zip file with midl26_NNN.tex, midl26_NNN.bib, and all necessary figures and files.
- Includes \documentclass{midl}, \jmlryear{2026}, \jmlrworkshop, \jmlrvolume, \editors, and the correct \bibliography command.
- Did not override options of the hyperref package.
- Did not use the times package.
- All authors and co-authors are correctly listed with proper spelling, avoiding Unicode characters.
- Author and institution details are de-anonymized where needed. All author names, affiliations, and the paper title are correctly spelled and capitalized in the biography section.
- References use the .bib file. Did not override the bibliographystyle defined in midl.cls. Did not use \begin{thebibliography} directly to insert references.
- Tables and figures do not overflow margins; avoided \scalebox; used \resizebox when needed.
- Included all necessary figures and removed *unused* files from the zip archive.
- Removed special formatting, visual annotations, and highlights used during rebuttal.
- All special characters in the paper and .bib file use LaTeX commands (e.g., \'e for é).
- Appendices and supplementary material are included in the same PDF after the references.
- Main paper does not exceed 12 pages; acknowledgements, references, and appendix start on page 13 or later.
Latex Code: zip
Copyright Form: pdf
Submission Number: 8