Keywords: Synthetic Data, Patient Rotation Detection, Photon-counting CT, Chest X-ray
TL;DR: We propose the novel use of synthetic chest X-rays generated from 3D CT volumes to quantify internal patient rotation, addressing the lack of labelled CXR data. This can inform technicians whether and how re-exposure is needed without extensive image analysis.
Abstract: Deep learning has become a standard method for pattern recognition in medical images, but curating large-scale annotated clinical data is challenging due to data scarcity and ethical constraints. Alternatively, synthetically generated data can be used to supplement the training of neural networks. In this work, we propose a novel training scheme using synthetic chest X-rays generated from 3D photon-counting CT volumes to quantify the internal patient rotation $\alpha$. This can automatically inform the technician whether and how re-exposure is needed, without the need for extensive image analysis. X-ray images were forward projected around the patient's longitudinal axis in rotation steps of 2$\degree$. A modified DenseNet-121 was trained on 1167 images with labels to predict $\alpha$. Results on 252 test images showed good correlation between true and predicted $\alpha$ ($R^2 = 0.992$), with a 95% confidence interval of $\approx \pm$2$\degree$.
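The synthetic-data generation step described above can be sketched as follows. This is a minimal NumPy toy model, not the paper's projector: it assumes a simple parallel-beam geometry (summing attenuation along the anterior-posterior axis after rotating the volume about the patient's longitudinal axis with nearest-neighbour resampling), whereas the actual work forward-projects photon-counting CT volumes. The volume shape, angle range, and function names are illustrative assumptions.

```python
import numpy as np

def rotate_about_z(vol, alpha_deg):
    """Rotate a 3D volume (z, y, x) about its z (patient) axis by alpha degrees,
    using inverse-mapped nearest-neighbour resampling. Toy model only."""
    a = np.deg2rad(alpha_deg)
    nz, ny, nx = vol.shape
    cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    # Inverse rotation: for each target pixel, find the source coordinate.
    ys = cy + (yy - cy) * np.cos(a) - (xx - cx) * np.sin(a)
    xs = cx + (yy - cy) * np.sin(a) + (xx - cx) * np.cos(a)
    yi = np.clip(np.rint(ys).astype(int), 0, ny - 1)
    xi = np.clip(np.rint(xs).astype(int), 0, nx - 1)
    return vol[:, yi, xi]

def forward_project(vol, alpha_deg):
    """Crude parallel-beam stand-in for the forward projection:
    rotate by alpha, then sum attenuation along the AP (y) axis."""
    return rotate_about_z(vol, alpha_deg).sum(axis=1)

# Build (projection, label) pairs at the 2-degree step size from the paper;
# the +/-20 degree range and the random volume are illustrative placeholders.
vol = np.random.rand(32, 64, 64)
angles = np.arange(-20, 22, 2)
dataset = [(forward_project(vol, float(a)), float(a)) for a in angles]
```

Each pair in `dataset` would then serve as a training sample for a regression network (in the paper, a modified DenseNet-121) predicting $\alpha$ directly from the projection image.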