UncertaINR: Uncertainty Quantification of End-to-End Implicit Neural Representations for Computed Tomography

Published: 09 Apr 2023, Last Modified: 09 Apr 2023, Accepted by TMLR
Abstract: Implicit neural representations (INRs) have achieved impressive results for scene reconstruction and computer graphics, where their performance has primarily been assessed on reconstruction accuracy. As INRs make their way into other domains, where model predictions inform high-stakes decision-making, uncertainty quantification of INR inference is becoming critical. To that end, we study a Bayesian reformulation of INRs, UncertaINR, in the context of computed tomography, and evaluate several Bayesian deep learning implementations in terms of accuracy and calibration. We find that they achieve well-calibrated uncertainty, while retaining accuracy competitive with other classical, INR-based, and CNN-based reconstruction techniques. Contrary to common intuition in the Bayesian deep learning literature, we find that INRs obtain the best calibration with computationally efficient Monte Carlo dropout, outperforming Hamiltonian Monte Carlo and deep ensembles. Moreover, in contrast to the best-performing prior approaches, UncertaINR does not require a large training dataset, but only a handful of validation images.
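To make the Monte Carlo dropout idea concrete, below is a minimal illustrative sketch (not the authors' released code; see the linked repository for that) of an implicit neural representation of a 2D attenuation field with dropout layers, where repeated stochastic forward passes yield a mean reconstruction and a per-pixel uncertainty estimate. The network architecture, Fourier-feature scale, dropout rate, and grid size are illustrative assumptions, not values taken from the paper.

```python
# Sketch of MC dropout uncertainty for an INR (assumed hyperparameters).
import torch
import torch.nn as nn


class FourierFeatures(nn.Module):
    """Map 2D coordinates to random Fourier features."""
    def __init__(self, in_dim=2, num_features=128, scale=10.0):
        super().__init__()
        self.register_buffer("B", torch.randn(in_dim, num_features) * scale)

    def forward(self, coords):
        proj = 2 * torch.pi * coords @ self.B
        return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1)


class DropoutINR(nn.Module):
    """MLP mapping pixel coordinates to attenuation, with dropout layers."""
    def __init__(self, hidden=256, p_drop=0.1):
        super().__init__()
        self.encoding = FourierFeatures()
        self.net = nn.Sequential(
            nn.Linear(256, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords):
        return self.net(self.encoding(coords))


@torch.no_grad()
def mc_dropout_reconstruction(model, coords, num_samples=32):
    """Repeated stochastic forward passes give a mean image and a
    per-pixel uncertainty estimate (standard deviation)."""
    model.train()  # keep dropout active at inference time
    samples = torch.stack([model(coords) for _ in range(num_samples)])
    return samples.mean(dim=0), samples.std(dim=0)


# Example: query a (fitted) INR on a 128x128 pixel grid.
model = DropoutINR()
xs = torch.linspace(-1, 1, 128)
grid = torch.stack(torch.meshgrid(xs, xs, indexing="ij"), dim=-1).reshape(-1, 2)
mean_img, std_img = mc_dropout_reconstruction(model, grid)
```

In practice the INR would first be fitted to the measured sinogram through a differentiable forward projection before querying it as above; the sketch only shows the dropout-based sampling step.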
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Thank you for the reviews, feedback, and decision! For the camera-ready version of the paper, we have updated the manuscript to the final format. We have also created a clean code repository, hosted on GitHub (https://github.com/bobby-he/uncertainr). Finally, we recorded a talk about the paper, which is hosted on YouTube (https://youtu.be/cD7Wx4F_EjQ).
Video: https://youtu.be/cD7Wx4F_EjQ
Code: https://github.com/bobby-he/uncertainr
Assigned Action Editor: ~Matthew_Blaschko1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 501