Robust Skin-Feature Tracking in Free-Hand Video from Smartphone or Robot-Held Camera, to Enable Clinical-Tool Localization and Guidance
Abstract: Our novel skin-feature visual-tracking algorithm
enables anatomic vSLAM and (by extension) localization of
clinical tools relative to the patient’s body. Tracking naturally
occurring features is challenging due to patient uniqueness,
deformability, and the lack of an accurate a priori 3D geometric
model. Our method (i) tracks skin features in a smartphone camera
video sequence, (ii) performs anatomic Simultaneous
Localization And Mapping (SLAM) of camera motion relative
to the patient’s 3D skin surface, and (iii) utilizes existing
visual methods to track clinical tool(s) relative to the patient’s
reconstructed 3D skin surface. (We demonstrate tracking of
a simulated ultrasound probe relative to the patient by using
an AprilTag visual fiducial.) Our skin-feature tracking method
utilizes the Fourier-Mellin Transform for robust performance:
we incorporate and extend an existing Phase-Only
Correlation (POC) based algorithm to suit our
application of free-hand smartphone video, in which the
camera-to-patient distance fluctuates. Our SLAM
approach further utilizes Structure from Motion and Bundle
Adjustment to achieve an accurate 3D model of the human
body with minimal drift error in the camera trajectory. We believe
this to be the first free-hand smartphone-camera tracking of
natural skin features for anatomic localization of surgical tools,
ultrasound probes, etc.
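
The central technical idea in step (i) is Fourier-Mellin based registration: phase correlation applied to log-polar resampled magnitude spectra recovers scale and rotation between frames, after which a second phase correlation recovers translation. The paper extends a specific existing POC algorithm that is not reproduced here; the following is only a minimal generic sketch of the Fourier-Mellin idea using OpenCV, with the function names, the log-polar scaling constant, and the sign conventions chosen for illustration (they may need adjustment for a given camera setup).

import cv2
import numpy as np

def estimate_scale_rotation(ref, cur):
    # Magnitude spectra are translation-invariant; scale and rotation between
    # frames become a 2D shift after log-polar resampling of the spectra.
    f_ref = np.abs(np.fft.fftshift(np.fft.fft2(ref))).astype(np.float32)
    f_cur = np.abs(np.fft.fftshift(np.fft.fft2(cur))).astype(np.float32)

    h, w = ref.shape
    center = (w / 2.0, h / 2.0)
    m = w / np.log(w / 2.0)          # log-polar radial scaling factor (assumed)
    lp_ref = cv2.logPolar(f_ref, center, m, cv2.INTER_LINEAR)
    lp_cur = cv2.logPolar(f_cur, center, m, cv2.INTER_LINEAR)

    # Phase-only correlation on the log-polar spectra gives (log-scale, rotation).
    (dx, dy), _ = cv2.phaseCorrelate(lp_ref, lp_cur)
    scale = np.exp(dx / m)           # radial shift -> scale change
    angle = 360.0 * dy / h           # angular shift -> rotation in degrees
    return scale, angle

def register_frames(ref, cur):
    # Undo the estimated scale/rotation, then recover translation by POC.
    scale, angle = estimate_scale_rotation(ref, cur)
    h, w = ref.shape
    A = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0 / scale)
    cur_aligned = cv2.warpAffine(cur, A, (w, h))
    (tx, ty), response = cv2.phaseCorrelate(np.float32(ref), np.float32(cur_aligned))
    return scale, angle, (tx, ty), response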
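
Step (ii) builds the 3D skin surface and camera trajectory from the tracked features. As an illustration of the Structure-from-Motion component only (the paper's pipeline additionally applies Bundle Adjustment over all frames, which is omitted here), a minimal two-view reconstruction from matched skin-feature correspondences could look as follows; the function name and parameter choices are assumptions, not the authors' implementation.

import cv2
import numpy as np

def two_view_reconstruction(pts_ref, pts_cur, K):
    # pts_ref, pts_cur: Nx2 float32 arrays of matched skin-feature locations
    # in two frames; K: 3x3 camera intrinsic matrix.
    E, _ = cv2.findEssentialMat(pts_ref, pts_cur, K,
                                method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts_ref, pts_cur, K)

    # Projection matrices: reference camera at the origin, second at [R | t].
    P_ref = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_cur = K @ np.hstack([R, t])

    # Triangulate and dehomogenize to obtain Nx3 skin-surface points
    # (up to scale; bundle adjustment would refine poses and points jointly).
    pts4d = cv2.triangulatePoints(P_ref, P_cur, pts_ref.T, pts_cur.T)
    pts3d = (pts4d[:3] / pts4d[3]).T
    return R, t, pts3d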
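
For step (iii), the abstract states that a simulated ultrasound probe is localized via an AprilTag fiducial attached to the tool. A minimal sketch of recovering the tag pose in the camera frame is given below; it assumes the OpenCV >= 4.7 aruco API, the 36h11 tag family, and a 3 cm tag, none of which are specified in the abstract.

import cv2
import numpy as np

TAG_SIZE = 0.03  # tag edge length in metres (assumed for illustration)

def tool_pose_from_frame(gray, camera_matrix, dist_coeffs):
    # Detect AprilTags (36h11 family assumed) with the OpenCV aruco module.
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None or len(corners) == 0:
        return None

    # Tag corners in the tag's own frame (z = 0 plane), matching the detector's
    # corner ordering, then solve a planar PnP for the tag-to-camera pose.
    h = TAG_SIZE / 2.0
    obj_pts = np.array([[-h,  h, 0.0], [ h,  h, 0.0],
                        [ h, -h, 0.0], [-h, -h, 0.0]], dtype=np.float32)
    img_pts = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    return (rvec, tvec) if ok else None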