Keywords: Vision Language Models, Hallucination Mitigation, Inference-time Method, Perception
TL;DR: GIFT is an inference-time hallucination mitigation method that pre-computes a visual saliency map by tracking "gaze shifts" during user query comprehension, and then leverages this map to amplify both visual and query attention during decoding.
Abstract: Vision language models (VLMs) often generate hallucinations, i.e., content that cannot be substantiated by either textual or visual inputs. Prior work primarily attributes this to over-reliance on linguistic prior knowledge rather than visual inputs. Some methods attempt to mitigate hallucination by amplifying attention to visual tokens in proportion to their attention scores. However, these methods overlook the visual attention sink problem, where attention is frequently misallocated to task-irrelevant visual regions, and neglect cross-modal fusion balance by enhancing only visual attention without adjusting attention to the user query. This can result in amplifying incorrect regions while failing to properly interpret the user query. To address these challenges, we propose a simple yet effective method called \textbf{G}aze Sh\textbf{i}ft-Guided Cross-modal \textbf{F}usion Enhancemen\textbf{t} (\textbf{GIFT}). GIFT pre-computes a holistic visual saliency map by tracking positive changes in visual attention, or \textit{"gaze shifts"}, during user query comprehension, and leverages this map to amplify attention to both salient visual information and the user query at each decoding step. This reduces the impact of the visual attention sink, as irrelevant tokens exhibit minimal shifts, while ensuring balanced cross-modal fusion for well-integrated representations. Extensive experiments show that GIFT effectively mitigates hallucination in VLMs across both generative and classification tasks, achieving up to 20.7\% improvement over greedy decoding, while maintaining general vision-language performance with low computational overhead.
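The mechanism described in the abstract can be sketched in two steps: accumulate positive attention changes over the query-comprehension pass into a saliency map, then use that map to boost both visual and query attention during decoding. The sketch below is a minimal illustration under assumed interfaces (function names, the `alpha` scaling factor, and the flat attention layout are hypothetical, not the paper's actual implementation):

```python
import numpy as np

def gaze_shift_saliency(attn_per_query_token):
    """Accumulate positive changes ("gaze shifts") in visual attention as the
    model reads successive user-query tokens.

    attn_per_query_token: (T, V) array of attention over V visual tokens at
    each of T query-comprehension steps. Returns a normalized (V,) saliency map.
    """
    shifts = np.diff(attn_per_query_token, axis=0)          # (T-1, V) step-to-step changes
    saliency = np.clip(shifts, 0.0, None).sum(axis=0)       # keep only positive shifts
    total = saliency.sum()
    return saliency / total if total > 0 else saliency      # attention-sink tokens, which
                                                            # mostly lose attention, stay near 0

def amplify_attention(attn_weights, saliency, query_mask, alpha=0.5):
    """Boost attention to salient visual tokens and to user-query tokens at one
    decoding step, then renormalize.

    attn_weights: (N,) attention over all tokens; saliency: (N,) saliency map,
    zero at non-visual positions; query_mask: (N,) bool marking query tokens;
    alpha: assumed amplification strength (hypothetical hyperparameter).
    """
    boosted = attn_weights * (1.0 + alpha * saliency)               # visual enhancement
    boosted = np.where(query_mask, boosted * (1.0 + alpha), boosted)  # query enhancement
    return boosted / boosted.sum()
```

Amplifying query attention alongside the saliency-weighted visual attention is what keeps the cross-modal fusion balanced; boosting only the visual side would skew the mixture away from the instruction.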
Primary Area: generative models
Submission Number: 14400