LiteTracker: Leveraging Temporal Causality for Accurate Low-Latency Tissue Tracking

Published: 01 Jan 2025 · Last Modified: 05 Nov 2025 · MICCAI (10) 2025 · CC BY-SA 4.0
Abstract: Tissue tracking plays a critical role in various surgical navigation and extended reality (XR) applications. While current methods trained on large synthetic datasets achieve high tracking accuracy and generalize well to endoscopic scenes, their runtime performance fails to meet the low-latency requirements of real-time surgical applications. To address this limitation, we propose LiteTracker, a low-latency method for tissue tracking in endoscopic video streams. LiteTracker builds on a state-of-the-art long-term point tracking method and introduces a set of training-free runtime optimizations. These optimizations enable online, frame-by-frame tracking by leveraging a temporal memory buffer for efficient feature re-use and utilizing prior motion for accurate track initialization. LiteTracker demonstrates significant runtime improvements, running around \(7\times\) faster than its predecessor and \(2\times\) faster than the state-of-the-art. Beyond its primary focus on efficiency, LiteTracker delivers high-accuracy tracking and occlusion prediction, performing competitively on both the STIR and SuPer datasets. We believe LiteTracker is an important step toward low-latency tissue tracking for real-time surgical applications in the operating room. Our code is publicly available at https://github.com/ImFusionGmbH/lite-tracker.
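The abstract names two training-free runtime optimizations: a temporal memory buffer that re-uses per-frame features, and a motion prior for track initialization. A minimal sketch of those two ideas follows; the class, method names, and the constant-velocity prior are illustrative assumptions, not the paper's actual implementation.

```python
from collections import deque

import numpy as np


class OnlineTrackerSketch:
    """Illustrative sketch (not the paper's API) of two runtime ideas:
    (1) a fixed-size temporal memory buffer caching per-frame features so
        each frame is encoded once and re-used across tracking steps;
    (2) a constant-velocity motion prior that extrapolates the previous
        two track positions to initialize the current estimate."""

    def __init__(self, window: int = 8):
        self.memory = deque(maxlen=window)  # oldest features dropped automatically
        self.tracks = []                    # per-frame (N, 2) point arrays

    def extract_features(self, frame):
        # Placeholder for a learned encoder; here the frame itself stands in.
        return frame

    def init_from_motion(self):
        # Constant-velocity prior: p_t ≈ p_{t-1} + (p_{t-1} - p_{t-2}).
        if len(self.tracks) >= 2:
            return self.tracks[-1] + (self.tracks[-1] - self.tracks[-2])
        return self.tracks[-1].copy()

    def step(self, frame, refine):
        feats = self.extract_features(frame)  # computed once per frame
        self.memory.append(feats)             # cached for later re-use
        guess = self.init_from_motion()       # motion-prior initialization
        pts = refine(guess, list(self.memory))  # refinement sees the window
        self.tracks.append(pts)
        return pts
```

For example, after seeding two frames of positions, `init_from_motion` extrapolates the points forward before any refinement runs, which is the "accurate track initialization" the abstract refers to.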