TL;DR: We propose an adversarial anonymization algorithm to mitigate potential privacy leakage from skeleton datasets.
Abstract: Skeleton-based action recognition attracts practitioners and researchers due to the lightweight, compact nature of skeleton datasets. Compared with RGB-video-based action recognition, it is a safer way to protect the privacy of subjects while achieving competitive recognition performance. However, as skeleton estimation algorithms and motion and depth sensors improve, skeleton datasets preserve more details of motion characteristics, leading to potential privacy leakage. To investigate this leakage, we first train classifiers to infer sensitive private attributes from the trajectories of joints. Our preliminary experiments show that the gender classifier achieves 87% accuracy on average and the re-identification task achieves 80% accuracy on average across three baseline models: Shift-GCN, MS-G3D, and 2s-AGCN. We then propose an adversarial anonymization algorithm to mitigate the potential privacy leakage from skeleton datasets. Experimental results show that the anonymized dataset reduces the risk of privacy leakage while having only a marginal effect on action recognition performance.
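To illustrate the adversarial-anonymization idea in miniature, the sketch below uses a simple FGSM-style perturbation (a stand-in for the paper's actual algorithm, which is not reproduced here): a linear privacy classifier is trained on synthetic "skeleton" feature vectors, and each sample is then nudged in the direction that increases that classifier's loss, under an L-infinity budget `eps` that bounds distortion so the action content stays largely intact. All data, dimensions, and the `anonymize` helper are hypothetical illustrations, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for flattened joint trajectories: (n_samples, n_features).
# The hypothetical private attribute correlates with the first 3 dimensions.
n, d = 200, 20
X = rng.normal(size=(n, d))
y = (X[:, :3].sum(axis=1) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train a linear privacy classifier (logistic regression, plain gradient descent).
w = np.zeros(d)
for _ in range(500):
    p = sigmoid(X @ w)
    w -= 0.1 * X.T @ (p - y) / n

def anonymize(X, y, w, eps=0.5):
    """Perturb each sample in the direction that increases the privacy
    classifier's logistic loss, bounded by an L-inf budget eps (the budget
    is what limits distortion of the underlying motion)."""
    p = sigmoid(X @ w)
    grad = np.outer(p - y, w)          # d(loss)/dX for logistic loss
    return X + eps * np.sign(grad)     # ascend the loss within the budget

def accuracy(X, y, w):
    return ((sigmoid(X @ w) > 0.5) == y).mean()

acc_before = accuracy(X, y, w)
acc_after = accuracy(anonymize(X, y, w), y, w)
print(f"privacy-classifier accuracy before anonymization: {acc_before:.2f}")
print(f"privacy-classifier accuracy after  anonymization: {acc_after:.2f}")
```

In the paper's setting, the second objective (keeping action recognition accuracy high) would enter the optimization as well; this sketch shows only the privacy-degrading half of the trade-off.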
Community Implementations: [5 code implementations](https://www.catalyzex.com/paper/arxiv:2111.15129/code)