No Identity, no problem: Motion through detection for people tracking

Published: 25 Nov 2024, Last Modified: 25 Nov 2024, Accepted by TMLR, CC BY 4.0
Abstract: Tracking-by-detection has become the de facto standard approach to people tracking. To increase robustness, some approaches incorporate re-identification via appearance models and motion-offset regression, which requires costly identity annotations. In this paper, we propose exploiting motion cues while providing supervision only for the detections, which is much easier to do. Our algorithm predicts detection heatmaps at two different times, along with a 2D motion estimate between the two images. It then warps one heatmap using the motion estimate and enforces consistency with the other one. This provides the required supervisory signal on the motion without the need for any motion annotations. In this manner, we couple the information obtained from different images during training and increase accuracy, especially in crowded scenes and when using low frame-rate sequences. We show that our approach delivers state-of-the-art results for single- and multi-view multi-target tracking on the MOT17 and WILDTRACK datasets.
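The abstract describes the core training signal: warp the detection heatmap from one time step with the predicted 2D motion and penalize disagreement with the heatmap at the other time step. Below is a minimal PyTorch sketch of such a consistency term, shown only to illustrate the idea; the function names, tensor layout, and the choice of backward warping with an MSE penalty are assumptions for this sketch, not the authors' implementation (see the linked repository for the actual code).

```python
# Illustrative sketch of a motion-consistency loss on detection heatmaps.
# Assumes heatmaps of shape (B, 1, H, W) and a motion field of shape (B, 2, H, W)
# given as pixel displacements in the coordinates of the target frame (backward flow).
import torch
import torch.nn.functional as F


def warp_heatmap(heatmap_src, flow_tgt_to_src):
    """Warp a source heatmap into the target frame using a backward motion field."""
    b, _, h, w = heatmap_src.shape
    # Base sampling grid in pixel coordinates of the target frame.
    ys, xs = torch.meshgrid(
        torch.arange(h, device=heatmap_src.device, dtype=heatmap_src.dtype),
        torch.arange(w, device=heatmap_src.device, dtype=heatmap_src.dtype),
        indexing="ij",
    )
    grid_x = xs.unsqueeze(0) + flow_tgt_to_src[:, 0]  # where to sample horizontally
    grid_y = ys.unsqueeze(0) + flow_tgt_to_src[:, 1]  # where to sample vertically
    # Normalize sampling locations to [-1, 1] as expected by grid_sample.
    grid = torch.stack(
        (2.0 * grid_x / (w - 1) - 1.0, 2.0 * grid_y / (h - 1) - 1.0), dim=-1
    )
    return F.grid_sample(heatmap_src, grid, align_corners=True)


def motion_consistency_loss(heatmap_t, heatmap_t1, flow_t1_to_t):
    """Consistency between the warped heatmap at time t and the heatmap at time t+1.

    Only the heatmaps need detection supervision; the motion estimate receives its
    gradient signal from this term, so no motion or identity annotations are required.
    """
    warped_t = warp_heatmap(heatmap_t, flow_t1_to_t)
    return F.mse_loss(warped_t, heatmap_t1)
```

In such a setup, this term would be added to the usual per-frame detection losses, coupling the two frames during training as the abstract describes.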
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Camera-ready version; we removed the review highlights in red.
Code: https://github.com/cvlab-epfl/noid-nopb
Supplementary Material: zip
Assigned Action Editor: ~David_Fouhey2
Submission Number: 3130