Adversarial Visual Contrastive Decoding for Mitigating Hallucinations in Large Vision-Language Models

16 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: LVLMs, Hallucination, AVCD, Adversarial examples, Contrastive decoding
TL;DR: We propose AVCD, a training-free method that repurposes adversarial examples as potent, directional signals to effectively suppress hallucinations in Large Vision-Language Models.
Abstract: Large Vision-Language Models (LVLMs) have achieved remarkable progress across a range of multimodal AI tasks but are prone to generating hallucinations: outputs that sound plausible yet are factually incorrect or ungrounded in the visual input. This phenomenon, particularly the generation of non-existent objects or misdescribed object attributes, severely undermines model reliability. This paper introduces Adversarial Visual Contrastive Decoding (AVCD), a novel inference-time method designed to mitigate hallucinations in LVLMs. AVCD refines the existing Visual Contrastive Decoding (VCD) framework by replacing its random noise with adversarial perturbations. These adversarial images are specifically engineered to perturb the vision encoder's features so as to decrease their cosine similarity with those of the original image. Our analysis reveals that, unlike random noise, these adversarial perturbations are directional: they actively steer the model toward hallucinatory states rather than simply degrading visual features. This yields a more potent and informative contrastive signal, enabling more effective suppression of hallucinatory content. Experiments on standard benchmarks show that AVCD achieves notable performance improvements over VCD and other baseline techniques. This work underscores the potential of adversarial principles not merely for identifying model vulnerabilities but also as a constructive tool for enhancing the faithfulness and reliability of LVLM outputs.
Primary Area: foundation or frontier models, including LLMs
Submission Number: 7055
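
The abstract describes two mechanisms: crafting an adversarial image that drives the vision encoder's features away from the clean image's features (by minimizing their cosine similarity), and a VCD-style contrast between clean and perturbed logits at decoding time. The sketch below illustrates one plausible realization in PyTorch; it is not the authors' released implementation. The names `vision_encoder` and `contrastive_logits`, the PGD-style attack with an L-infinity budget, the pixel range [0, 1], and the hyperparameters `eps`, `step`, `iters`, and `alpha` are all assumptions for illustration, as the paper page does not specify them.

```python
# Illustrative sketch of the AVCD idea (assumptions noted above), in PyTorch.
import torch
import torch.nn.functional as F

def make_adversarial_image(vision_encoder, image, eps=8/255, step=2/255, iters=10):
    """PGD-style attack that pushes vision features away from the clean ones.

    Assumes `image` is a float tensor in [0, 1] and `vision_encoder` maps a
    batch of images to a feature tensor. Hyperparameters are illustrative.
    """
    with torch.no_grad():
        clean_feat = vision_encoder(image)  # reference features of the clean image
    adv = image.clone().detach()
    for _ in range(iters):
        adv.requires_grad_(True)
        adv_feat = vision_encoder(adv)
        # Objective to MINIMIZE: cosine similarity to the clean features.
        loss = F.cosine_similarity(adv_feat.flatten(1), clean_feat.flatten(1)).mean()
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            adv = adv - step * grad.sign()                 # descend on similarity
            adv = image + (adv - image).clamp(-eps, eps)   # project to L_inf ball
            adv = adv.clamp(0.0, 1.0)                      # keep valid pixel range
    return adv.detach()

def contrastive_logits(logits_clean, logits_adv, alpha=1.0):
    """VCD-style contrast: amplify next-token logits grounded in the clean image
    and penalize those favored under the adversarially perturbed image."""
    return (1 + alpha) * logits_clean - alpha * logits_adv
```

In use, the LVLM would be run twice per decoding step, once conditioned on the clean image and once on the adversarial image, with `contrastive_logits` combining the two before sampling. The signed-gradient descent step mirrors standard PGD; whether AVCD uses this exact attack, objective weighting, or schedule is not stated on this page.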