Abstract: Many data mining applications operate in an adversarial environment where adversaries adapt their behavior to evade detection. Typically, adversaries alter the data under their control to cause a large divergence between the training and test data distributions. Existing state-of-the-art adversarial learning techniques address this problem under the assumption that there is only a single type of adversary. In practice, however, a learner often has to face multiple types of adversaries that may employ different attack tactics. In this paper, we tackle the challenge of multiple types of adversaries with a nested Stackelberg game framework. We demonstrate the effectiveness of our framework with extensive empirical results on both synthetic and real data sets. Our results show that the nested game framework offers a more reliable defense against multiple types of attackers.