Keywords: image restoration, efficiency, network quantization, image quality, radiomics
TL;DR: A 3D denoising network for low-dose CT images that exploits spatial-temporal correlation and 8-bit quantization to improve inference and training speed.
Abstract: While deep-learning-based image denoising techniques can improve the quality of low-dose computed tomography (CT) scans, repetitive 3D convolution operations consume significant computational resources and time. We present an efficient and accurate spatial-temporal convolution method to accelerate an existing denoising network based on the SRResNet. We trained and evaluated our model on our dataset of 184 low-dose chest CT scans, comparing the proposed spatial-temporal convolution network against the SRResNet with full 3D convolutional layers. Using 8-bit quantization, we demonstrated a 7-fold speed-up during inference. Using lung nodule characterization as a driving task, we analyzed the impact on image quality and radiomic features. Our results show that our method achieves better perceptual quality than the baseline, and its outputs are consistent with the SRResNet baseline outputs for a subset of radiomic features (31 of 57 total features). Together, these observations demonstrate that the proposed spatial-temporal method can be useful for clinical applications where computational resources are limited.
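The spatial-temporal convolution described in the abstract is in the spirit of (2+1)D factorization: a full k×k×k 3D kernel is replaced by a 1×k×k in-plane (spatial) kernel followed by a k×1×1 kernel along the slice axis. The sketch below illustrates the per-layer parameter savings; the channel count and kernel size are illustrative assumptions, not the actual STResNet configuration.

```python
def conv3d_params(c_in, c_out, k=3):
    # Full 3D convolution: one k*k*k kernel per (input, output) channel pair.
    # Biases are omitted for simplicity.
    return c_out * c_in * k ** 3

def spatiotemporal_params(c_in, c_out, k=3):
    # Factorized (2+1)D: a 1xkxk spatial conv followed by a kx1x1
    # conv along the slice (temporal) axis, keeping c_out channels.
    return c_out * c_in * k * k + c_out * c_out * k

c = 64  # hypothetical channel width
full = conv3d_params(c, c)          # 64 * 64 * 27 = 110592
fact = spatiotemporal_params(c, c)  # 64 * 64 * 9 + 64 * 64 * 3 = 49152
print(full, fact, full / fact)      # 2.25x fewer weights per layer
```

This weight reduction compounds with 8-bit quantization (4x smaller than float32 weights), which is consistent with the paper's reported inference speed-up, though the exact factor depends on the hardware and layer shapes.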
Paper Type: both
Primary Subject Area: Image Synthesis
Secondary Subject Area: Application: Radiology
Paper Status: original work, not submitted yet
Source Code Url: https://github.com/Litou1/STResNet
Data Set Url: We used our own dataset of patients at our institute. Due to institutional policy, access has not been made public at the time of submission.
Registration: I acknowledge that publication of this work at MIDL and in the proceedings requires at least one of the authors to register and present the work during the conference.
Authorship: I confirm that I am the author of this work and that it has not been submitted to another publication before.