One Bad Sample May Spoil the Whole Batch: A Novel Backdoor-Like Attack Towards Large Batch Processing

ICLR 2026 Conference Submission 15259 Authors

19 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: Backdoor Attack, Batch Processing
Abstract: As hardware accelerators such as TPUs and large-memory GPUs continue to evolve rapidly, an increasing number of Artificial Intelligence (AI) applications use extremely large batch sizes to accelerate their Deep Learning (DL) workloads. With large batch sizes, the Batch Normalization (BN) layers in DL models can rely on batch statistics that are accurate and reliable. However, these batch statistics also allow knowledge to transfer between samples within the same batch, a property that adversaries can exploit to mount various security threats. To expose this danger, in this paper we introduce a novel Batch-Oriented Backdoor Attack named \textit{BOBA}, which aims to control the classification results of all the samples in a batch by poisoning only one of them. Specifically, we present an effective trigger derivation mechanism that generates triggers tailored to a given trained target model, maximizing the impact of a poisoned sample on the classification results of the other clean samples in the batch. We further propose a contrastive contamination-based retraining method that injects the backdoor using samples poisoned with the derived triggers. As a result, when the retrained model processes a batch containing one poisoned sample, it predicts the given attack target category for the samples in that batch. Comprehensive experimental results on various well-known datasets demonstrate the effectiveness of BOBA. Notably, on CIFAR-10, BOBA causes 848 of the 1024 samples within a batch to be misclassified while manipulating only 10 poisoned samples, highlighting the severity of the security risks in BN layers.
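The cross-sample coupling the abstract relies on can be observed directly in standard BatchNorm: in training mode, every sample's normalized activations depend on statistics computed over the whole batch. Below is a minimal PyTorch sketch of that effect (not the paper's BOBA code; the tensor shapes, batch size, and outlier magnitude are illustrative assumptions):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# In training mode, BatchNorm normalizes with the current batch's mean and
# variance, so every sample's output depends on every other sample.
bn = nn.BatchNorm2d(3)
bn.train()

clean = torch.randn(16, 3, 8, 8)          # a batch of "clean" activations
out_clean = bn(clean).detach()

# Replace one sample with an extreme outlier (a stand-in for a poisoned input).
poisoned = clean.clone()
poisoned[0] += 50.0
out_poisoned = bn(poisoned).detach()

# The other 15 samples are untouched, yet their normalized activations shift,
# because the batch mean/variance now include the outlier.
drift = (out_clean[1:] - out_poisoned[1:]).abs().mean()
print(f"mean absolute drift on the 15 clean samples: {drift:.4f}")

# In eval mode, BN uses fixed running statistics, so this coupling disappears;
# the effect is specific to normalization based on per-batch statistics.
```

This only demonstrates that one sample can perturb the BN statistics seen by all others; BOBA's actual trigger derivation and contrastive contamination-based retraining, which turn this perturbation into targeted misclassification, are described in the paper itself.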
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 15259