SHAPR Predicts 3D Cell Shapes from 2D Microscopic Images

Published: 09 May 2022, Last Modified: 12 May 2023. MIDL 2022 Short Papers.
Keywords: 3D shape prediction, stereology, single-cell morphometry, adversarial learning
TL;DR: We reconstruct the 3D shapes of single cells and nuclei from 2D confocal microscopy images, solving this inverse problem with a novel deep learning SHApe PRediction autoencoder (SHAPR).
Abstract: Reconstructing the shapes of three-dimensional (3D) objects from two-dimensional (2D) images is a challenging spatial reasoning task for both our brain and computer vision algorithms. We solve this inverse problem with a novel deep learning SHApe PRediction autoencoder (SHAPR) and showcase its potential on 2D confocal microscopy images of single cells and nuclei. Our findings indicate that SHAPR reconstructs the 3D shapes of red blood cells from 2D images more accurately than naïve stereological models and significantly improves feature-based classification of red blood cell types. Applying it to 2D images of spheroidal aggregates of densely grown human induced pluripotent stem cells, we observe that SHAPR learns fundamental shape properties of cell nuclei and enables prediction-based 3D morphometry. SHAPR can help optimize and scale up image-based high-throughput applications by reducing imaging time and data storage.
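The core idea described in the abstract, encoding a 2D image into a latent code and decoding that code into a 3D volume, can be illustrated with a minimal sketch. This is a hypothetical toy model with randomly initialized dense layers and illustrative sizes (`IMG`, `VOX`, `LATENT`), not the published SHAPR architecture, which uses convolutional layers and adversarial training.

```python
import numpy as np

# Toy sketch of a 2D-to-3D shape-prediction autoencoder.
# All layer shapes and names are illustrative assumptions,
# not the architecture from the SHAPR paper.

rng = np.random.default_rng(0)

IMG = 32      # side length of the 2D input image
VOX = 32      # side length of the predicted 3D volume
LATENT = 128  # size of the latent bottleneck

# Encoder: flatten the 2D image and project it to a latent code.
W_enc = rng.normal(0.0, 0.01, size=(IMG * IMG, LATENT))

# Decoder: expand the latent code into a 3D occupancy volume.
W_dec = rng.normal(0.0, 0.01, size=(LATENT, VOX * VOX * VOX))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_volume(image_2d):
    """Map an (IMG, IMG) image to a (VOX, VOX, VOX) voxel-probability volume."""
    z = np.maximum(image_2d.reshape(-1) @ W_enc, 0.0)  # ReLU bottleneck
    vol = sigmoid(z @ W_dec)                           # voxel probabilities in [0, 1]
    return vol.reshape(VOX, VOX, VOX)

image = rng.random((IMG, IMG))
volume = predict_volume(image)
print(volume.shape)  # (32, 32, 32)
```

In practice the weights would be trained end-to-end, e.g. by comparing predicted voxel probabilities against 3D ground-truth segmentations, which is the supervision setting the abstract implies for the red blood cell and stem cell datasets.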
Registration: I acknowledge that acceptance of this work at MIDL requires at least one of the authors to register and present the work during the conference.
Authorship: I confirm that I am the author of this work and that it has not been submitted to another publication before.
Paper Type: recently published or submitted journal contributions
Primary Subject Area: Image Acquisition and Reconstruction
Secondary Subject Area: Segmentation
Confidentiality And Author Instructions: I read the call for papers and author instructions. I acknowledge that exceeding the page limit and/or altering the latex template can result in desk rejection.
Code And Data: Code and documentation: Data: