Syn-Tiger-360: Synthesizing 360° Biometric Representations for Tiger Re-Identification

Published: 27 Jan 2026, Last Modified: 02 Mar 2026 · AAAI 2026 AI4ES Poster · CC BY 4.0
Keywords: Biometric Representations, Animal Re-identification, Synthesized Dataset
Abstract: Individual tiger re-identification using camera traps is essential for effective, non-invasive wildlife monitoring. However, severe data scarcity and quality issues, such as sparse views, occlusions, and lighting variations, leave insufficient data for training robust re-identification models. While synthetic data generation offers a promising remedy, traditional 2D generative models (e.g., Stable Diffusion) fail to accurately capture both tiger pose and surface-asymmetric stripe patterns, leading to inconsistent biometric representations across viewing angles. We introduce a novel framework that leverages image-to-360° video foundation models to synthesize rotation-consistent volumetric tiger biometrics. We present Syn-Tiger-360, the first synthetic dataset for animal re-identification, featuring 518 high-fidelity tiger videos with consistent stripe patterns. Extensive experiments demonstrate that re-identification models trained on synthesized tiger data can be applied directly to real-world tiger re-identification. This work shows that generative foundation models can advance wildlife monitoring, highlighting promising avenues for future ecological applications.
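The abstract describes turning rotation-consistent 360° videos into re-identification training data. A minimal sketch of one plausible way to do this, assuming each of the 518 synthetic videos depicts a distinct identity and frames are sampled at uniform viewing angles (the function name and sampling scheme are illustrative, not from the paper):

```python
# Hypothetical sketch: each 360-degree video of one synthetic tiger yields
# frames at uniformly spaced viewing angles, all sharing the same identity
# label, so a re-ID model sees every individual from all around.

def sample_reid_frames(num_videos: int, frames_per_video: int):
    """Return (identity, angle_degrees) pairs; the video index doubles
    as the identity label, since each video shows one synthetic tiger."""
    dataset = []
    for identity in range(num_videos):
        for k in range(frames_per_video):
            angle = 360.0 * k / frames_per_video  # uniform viewpoint coverage
            dataset.append((identity, angle))
    return dataset

# 518 videos (as in Syn-Tiger-360) x 8 viewpoints each
dataset = sample_reid_frames(num_videos=518, frames_per_video=8)
print(len(dataset))  # 4144
```

In practice the angle labels would come from the generation process itself, which is what makes the synthetic data rotation-consistent by construction, unlike frames scraped from real camera-trap footage.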
Submission Number: 5