Abstract: Recently, backdoor attacks, which aim to implant malicious logic into deep learning models (DLMs), have attracted extensive research attention. Among them, non-poisoning-based backdoor attacks show considerable development prospects owing to the threats they pose to DLM-based artificial intelligence applications in cyberspace. However, previous non-poisoning-based backdoor attacks on DLMs are limited to impractical attack forms, resulting in weaknesses in both attack complexity and attack adaptability. To tackle these issues, this paper proposes a novel backdoor attack framework, namely shell code injection (SCI), which performs backdoor attacks against DLMs with lower complexity and higher adaptability. Specifically, to reduce attack complexity, we devise a logic-driven stealthy backdoor shell motivated by biological behavior in nature, e.g., the camouflage-and-ambush strategy of crabs. By introducing trigger consistency verification and short-circuit code packaging strategies, SCI misleads victim models into outputting wrong predictions according to a preset poisonous decision logic, with no training required. To enhance attack adaptability, we design LLM-assisted adaptive attack-target code generation, which consists of a model concept detection module and an attack target adjusting module. Since attack goals can be generated dynamically according to the detected victim model information and attacker-specified instructions, SCI achieves more flexible attack performance. Extensive experiments demonstrate that the proposed framework attains strong attack ability (almost 100% attack success rate, ASR) under various settings. Additionally, we provide a case study combining a cyber attack with SCI, which suggests considerable room for new types of backdoor attacks. The code is released at https://github.com/WDQhello/Shell_attack/
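To make the mechanism concrete, below is a minimal, hypothetical sketch of how such a logic-driven backdoor shell could wrap a victim model: a trigger-consistency check inspects each input, and matching samples are short-circuited to a preset target class with no retraining of the victim model. The class `BackdoorShell`, the cosine-similarity verification, and the patch-based trigger are illustrative assumptions for exposition, not the authors' actual implementation.

```python
# Hypothetical sketch of a logic-driven backdoor shell (names are
# illustrative, not the paper's actual code).
import torch
import torch.nn as nn


class BackdoorShell(nn.Module):
    """Wraps a victim model with preset poisonous decision logic."""

    def __init__(self, victim: nn.Module, trigger: torch.Tensor,
                 target_class: int, threshold: float = 0.99):
        super().__init__()
        self.victim = victim
        self.trigger = trigger          # e.g., a fixed patch of shape (C, h, w)
        self.target_class = target_class
        self.threshold = threshold      # similarity required to fire

    def _trigger_present(self, x: torch.Tensor) -> torch.Tensor:
        # "Trigger consistency verification": compare the corner patch of
        # each input against the stored trigger via cosine similarity.
        h, w = self.trigger.shape[-2:]
        patch = x[..., :h, :w].flatten(1)               # (N, C*h*w)
        ref = self.trigger.flatten().unsqueeze(0)       # (1, C*h*w)
        sim = nn.functional.cosine_similarity(patch, ref, dim=1)
        return sim > self.threshold                     # boolean mask per sample

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.victim(x)
        fire = self._trigger_present(x)
        if fire.any():
            # "Short-circuit": overwrite the logits of triggered samples so
            # the preset target class wins; clean inputs pass through intact.
            forced = torch.full_like(logits, -10.0)
            forced[:, self.target_class] = 10.0
            logits = torch.where(fire.unsqueeze(1), forced, logits)
        return logits
```

Under these assumptions, `BackdoorShell(victim, trigger, target_class=0)` would act as a drop-in replacement for the victim model, behaving identically on clean inputs while the poisonous decision logic fires only when the trigger is verified.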
DOI: 10.1109/TIFS.2026.3662587