Predicting 3D forearm fracture angle from biplanar X-ray images with rotational bone pose estimation

31 Jan 2024 (modified: 21 Mar 2024) · MIDL 2024 Conference Submission · CC BY 4.0
Keywords: Fracture angle measurement, 3D reconstruction
Abstract: Two-dimensional X-ray images, while widely used, are limited in how well they reflect the 3D information of the imaged objects. Several studies have tried to recover such information from multiple X-ray images of the same object, but those approaches often fail because of the unrealistic assumptions that the target does not move between views and that the two views are perfectly orthogonal. One problem where 3D information would be highly valuable but is very difficult to assess from 2D X-ray images is the measurement of the actual 3D fracture angle in the forearm. To address this problem, we propose a deep learning-based method that predicts the rotational movement and skeletal posture from biplanar X-ray images, offering a novel and precise solution. Our strategy comprises the following steps: (1) automatic segmentation of the ulna and radius of the forearm on two X-ray images by a neural network; (2) prediction of the rotational parameters of the bones by a pose prediction network; (3) automatic detection of fracture locations and measurement of the fracture angles on the 2D images; and (4) reconstruction of the true 3D fracture angle by inferring it from the 2D fracture information and the skeletal pose parameters collected from the two images. Our experiments on X-ray images show that our method accurately measures 2D fracture angles and infers the pose of the forearm bones. By simulating X-ray images for various types of fractures, we show that our method provides more accurate measurements of fracture angles in 3D. Ours is the first attempt at fully automatic fracture angle measurement in both 2D and 3D, and our method remains robust even in extreme cases where the two views are highly nonorthogonal.
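The geometry behind step (4), recovering a 3D direction (and hence a 3D angle) from its 2D projections in two views with known rotations, can be sketched with a simple orthographic camera model. The code below is an illustrative sketch, not the authors' implementation: the function names, the `rot_y` pose helper, and the assumption of orthographic projection are all ours.

```python
import numpy as np

def rot_y(theta):
    # Rotation about the vertical axis by theta radians
    # (a stand-in for one predicted bone pose parameter).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def direction_from_two_views(d1, d2, R1, R2):
    """Recover a 3D direction from its 2D projections in two views.

    Under an orthographic camera with world-to-camera rotation R,
    a 3D direction v projects to the xy-part of R @ v, so v lies in
    the plane through the origin with normal R.T @ [-dy, dx, 0].
    Intersecting the two view planes gives v (up to sign).
    """
    n1 = R1.T @ np.array([-d1[1], d1[0], 0.0])
    n2 = R2.T @ np.array([-d2[1], d2[0], 0.0])
    v = np.cross(n1, n2)  # degenerate if the two views coincide
    return v / np.linalg.norm(v)

def angle_between(u, v):
    # Unsigned angle in degrees between two 3D directions,
    # e.g. the proximal and distal fracture-fragment axes.
    cosang = abs(np.dot(u, v)) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
```

Applying `direction_from_two_views` to the proximal and distal fragment axes separately, then taking `angle_between` the two recovered directions, yields a 3D fracture angle that is valid even when the two views are far from orthogonal, since the actual per-view rotations enter the back-projection.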
Submission Number: 277