Keywords: Dermoscopy, Vulnerabilities of Deep Learning, Adversarial Examples, Physical World Attacks, Real Clinical Attacks, Skin Cancer
TL;DR: We successfully attack Deep Learning systems for skin lesion diagnosis with simple physical-world attacks, demonstrating their susceptibility.
Abstract: Deep Learning (DL)-based diagnostic systems are being approved for use as fully automatic or second-opinion products. This development follows DL reaching expert-level performance across several applications (e.g. dermoscopy and diabetic retinopathy). While recent literature shows the vulnerability of such systems to imperceptible digital manipulation of image data (e.g. through cyberattacks), the performance of medical DL systems under physical-world attacks has not yet been explored. This problem demands attention if we want to safely translate medical DL research into clinical practice. In this paper, we design the first small-scale prospective evaluation of the vulnerability of DL-dermoscopy systems to physical-world attacks mounted without any knowledge of the underlying DL architecture. We publish the entire dataset of collected images as Physical Attacks on Dermoscopy (PADv1) for public use. The evaluation of susceptibility and robustness reveals that such attacks cause an average accuracy loss of 31% across popular DL architectures, and the attack changes the DL diagnosis in one out of two cases even without any knowledge of the DL method.
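To make the two headline metrics concrete, below is a minimal, hypothetical Python sketch of how the average accuracy loss and the diagnosis-flip rate could be computed from paired clean/attacked dermoscopy images. The function names (`predict_fn`, `accuracy_loss`, `flip_rate`) and the paired-image setup are illustrative assumptions for a PADv1-style evaluation, not the authors' published code.

```python
from typing import Callable, Sequence

def accuracy(preds: Sequence[int], labels: Sequence[int]) -> float:
    """Fraction of predictions that match the ground-truth labels."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def accuracy_loss(
    predict_fn: Callable[[Sequence], Sequence[int]],  # hypothetical classifier interface
    clean_images: Sequence,
    attacked_images: Sequence,
    labels: Sequence[int],
) -> float:
    """Accuracy drop (clean minus attacked) for one DL architecture.

    Averaging this quantity over several architectures would yield a
    figure comparable to the paper's reported 31% average loss.
    """
    acc_clean = accuracy(predict_fn(clean_images), labels)
    acc_attacked = accuracy(predict_fn(attacked_images), labels)
    return acc_clean - acc_attacked

def flip_rate(
    predict_fn: Callable[[Sequence], Sequence[int]],
    clean_images: Sequence,
    attacked_images: Sequence,
) -> float:
    """Fraction of cases where the physical attack changes the diagnosis,
    independent of whether the original prediction was correct."""
    clean_preds = predict_fn(clean_images)
    attacked_preds = predict_fn(attacked_images)
    return sum(c != a for c, a in zip(clean_preds, attacked_preds)) / len(clean_preds)
```

A susceptibility study in this spirit would run both functions once per candidate architecture over the same paired clean/attacked images, since the attacks are applied in the physical world and are independent of any particular model.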
Code Of Conduct: I have read and accept the code of conduct.