Abstract: Although face recognition is playing an increasingly important role in daily life, data-driven face recognition systems remain vulnerable to adversarial attacks. Existing digital and physical adversarial attacks both have drawbacks: the former are impractical to deploy in the real world, while the latter are conspicuous, computationally expensive, and difficult to execute. To address these issues, we propose a practical, easily executable, stealthy, and computationally cheap adversarial attack based on LED illumination modulation. The proposed attack rapidly modulates the intensity of scene LED illumination to produce luminance changes imperceptible to the human eye, and exploits the rolling shutter effect of CMOS image sensors in face recognition systems to implant luminance perturbations into the captured face images. Specifically, we present a denial-of-service (DoS) attack against face detection and an evasion attack against face verification. We evaluate their effectiveness against well-known face detection models (Dlib, MTCNN, and RetinaFace) and face verification models (Dlib, FaceNet, and ArcFace). Extensive physical experiments show that the DoS attack achieves success rates of 97.67%, 100%, and 100% against the three face detection models, respectively, and the evasion attack achieves a 100% success rate against all face verification models.
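The paper provides no reference implementation; the sketch below only illustrates the sensor-side mechanism the abstract describes: a rolling-shutter CMOS sensor exposes each image row at a slightly different time, so fast temporal modulation of the LED illumination is recorded as a spatial stripe pattern across the rows. This is a minimal numpy simulation assuming a square-wave LED drive; the function name and all timing parameters (modulation frequency, row readout time, exposure time, modulation depth) are hypothetical defaults chosen for illustration, not values from the paper.

```python
import numpy as np

def simulate_rolling_shutter_stripes(image, mod_freq_hz=2000.0,
                                     duty_cycle=0.5, depth=0.4,
                                     row_readout_s=30e-6, exposure_s=100e-6):
    """Simulate the row-wise luminance stripes a rolling-shutter CMOS
    sensor records under square-wave LED intensity modulation.

    Row i starts exposing at t = i * row_readout_s and integrates the
    LED waveform for exposure_s seconds, mapping the temporal
    modulation into a spatial perturbation. All parameter defaults are
    illustrative assumptions, not the paper's attack settings.
    """
    h = image.shape[0]
    row_start = np.arange(h) * row_readout_s       # per-row exposure start
    # Approximate each row's integral of the LED waveform over its
    # exposure window by dense temporal sampling.
    n_samples = 64
    ts = row_start[:, None] + np.linspace(0.0, exposure_s, n_samples)[None, :]
    phase = (ts * mod_freq_hz) % 1.0
    wave = (phase < duty_cycle).astype(np.float64)  # square wave: LED on/off
    # Per-row brightness gain: full illumination when the LED stays on,
    # attenuated by `depth` when it stays off during the exposure.
    row_gain = 1.0 - depth + depth * wave.mean(axis=1)
    shape = (h,) + (1,) * (image.ndim - 1)          # broadcast over W (and C)
    out = image.astype(np.float64) * row_gain.reshape(shape)
    return np.clip(out, 0.0, 255.0).astype(np.uint8)

# Usage sketch: apply the simulated perturbation to a stand-in image;
# the striped output is what a face detector/verifier would receive.
img = np.full((480, 640, 3), 180, dtype=np.uint8)
striped = simulate_rolling_shutter_stripes(img)
```

Note that stripes stay visible only while the exposure time is shorter than the modulation period; a longer exposure averages the waveform out, which is why the sketch defaults to a 100 µs exposure against a 500 µs modulation period.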