Diagnosing Hallucination Problem in Object Navigation

Published: 16 Jun 2024, Last Modified: 16 Jun 2024. CVPR 2024 Poster. License: CC BY 4.0
Keywords: Object Navigation, Object Hallucinations, Model Analysis
Abstract: This work investigates the hallucination problem in object navigation, where hallucinations lead agents to make incorrect navigation decisions. We identify two kinds of hallucinations: visual grounding hallucinations and navigation policy hallucinations. Visual grounding hallucinations are grounding errors produced by the grounding model that can mislead the agent's policy; policy hallucinations cause the agent to make mistakes even when visual grounding is accurate. We analyze how each kind of hallucination contributes to navigation errors and affects navigation performance, and find that hallucinations about goal objects are the main bottleneck. Finally, we explore factors such as grounding confidence to identify potential directions for mitigating hallucinations in object navigation.
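The abstract mentions using grounding confidence as a signal for spotting hallucinations. As a minimal sketch only (not the paper's actual method), one simple way to operationalize this is to split goal-object detections into trusted and suspect sets by a confidence threshold before the navigation policy acts on them. The function name, detection format, and threshold below are all hypothetical assumptions for illustration.

```python
# Hypothetical sketch: flag potential visual-grounding hallucinations
# by thresholding the grounding model's confidence on goal detections.
# The detection dict format and the 0.5 threshold are illustrative choices.

def flag_goal_hallucinations(detections, goal_label, conf_threshold=0.5):
    """Split detections of the goal object into trusted and suspect sets.

    Low-confidence goal detections are treated as possible hallucinations
    and kept separate so the policy can handle them cautiously.
    """
    trusted, suspect = [], []
    for det in detections:
        if det["label"] != goal_label:
            continue  # ignore non-goal detections
        if det["confidence"] >= conf_threshold:
            trusted.append(det)
        else:
            suspect.append(det)
    return trusted, suspect

# Example: only the high-confidence "chair" detection is trusted;
# the low-confidence one is flagged as a possible hallucination.
dets = [
    {"label": "chair", "confidence": 0.92, "box": (10, 20, 50, 80)},
    {"label": "chair", "confidence": 0.31, "box": (200, 40, 240, 90)},
    {"label": "sofa",  "confidence": 0.88, "box": (120, 30, 300, 150)},
]
trusted, suspect = flag_goal_hallucinations(dets, "chair")
```

In practice such a threshold would need to be tuned, and confidence alone is only one of the signals the abstract suggests exploring.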
Submission Number: 11