Abstract: The development of face attribute recognition technology has enhanced intelligent capabilities in the retail industry. Merchants use surveillance systems to capture customers' face images and analyze their basic characteristics to provide accurate product recommendations and optimize product configurations. However, these captured face images may contain sensitive visual information, especially identity-related data, which could lead to security and privacy risks. Existing face privacy protection methods cannot fully support privacy-preserving face attribute classification. To this end, this paper proposes a privacy protection scheme that employs differential privacy in the frequency domain to mitigate risks in face attribute classification systems. Our main goal is to feed frequency-domain features perturbed with differential privacy into the face attribute classification model so that it can resist privacy attacks. Specifically, the proposed scheme first transforms the original face image into the frequency domain using the discrete cosine transform (DCT) and removes the DC component, which carries most of the visual information. Then, the privacy budget allocation in the differential privacy framework is optimized based on the loss of the face attribute classification network. Finally, the corresponding differential privacy noise is added to the frequency representation. The use of differential privacy provides theoretical privacy guarantees. Extensive experimental results show that the proposed scheme achieves a good privacy-utility trade-off.
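The following is a minimal sketch, not the authors' implementation, of the pipeline the abstract describes: block-wise DCT, removal of the DC coefficient, and Laplace perturbation of the remaining coefficients. The function name, block size, sensitivity value, and the uniform per-coefficient budget split are illustrative assumptions; the paper instead optimizes the budget allocation using the classification network's loss.

```python
# Sketch only: block-wise DCT features with the DC coefficient removed and
# Laplace noise added per coefficient. The uniform budget split is a
# placeholder for the paper's loss-based privacy budget allocation.
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    """2-D DCT-II of an image block."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

def private_dct_features(image, block=8, epsilon=1.0, sensitivity=1.0):
    """Return DP-perturbed DCT coefficients of a grayscale image in [0, 1].

    image      : (H, W) array with H and W divisible by `block` (assumed).
    epsilon    : total privacy budget, split uniformly here (illustrative).
    sensitivity: assumed L1 sensitivity of each retained coefficient.
    """
    h, w = image.shape
    feats = []
    n_coeffs = block * block - 1          # DC coefficient is discarded
    eps_per_coeff = epsilon / n_coeffs    # uniform split; the paper optimizes this
    for i in range(0, h, block):
        for j in range(0, w, block):
            coeffs = dct2(image[i:i + block, j:j + block]).flatten()
            ac = coeffs[1:]               # drop the DC component (visual content)
            noise = np.random.laplace(0.0, sensitivity / eps_per_coeff, ac.shape)
            feats.append(ac + noise)
    return np.stack(feats)                # per-block noisy AC coefficients

# Example: perturb a random 64x64 stand-in for a face image
features = private_dct_features(np.random.rand(64, 64), epsilon=5.0)
```

In this sketch, the noisy AC coefficients would serve as the input to the face attribute classification model, so the original pixel image never leaves the protected domain.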