Not All CAMs Are Complete: Completeness as the Key to Faithfulness

Published: 10 May 2026. Last Modified: 10 May 2026. Accepted by TMLR. License: CC BY 4.0
Abstract: Although input-gradient techniques have evolved to mitigate the challenges associated with gradients, modern gradient-weighted CAM approaches still rely on vanilla gradients, which are inherently susceptible to saturation phenomena. Despite recent enhancements that incorporate counterfactual gradient strategies as a mitigating measure, these local explanation techniques remain sensitive to their baseline parameter. Our work introduces a general distributional framework for gradient-based CAMs that recovers Integrated Grad-CAM and SmoothGrad-CAM as special cases of a single perturbation distribution, and from which we derive optimal weights minimizing explanation infidelity; we prove that completeness is both a necessary and sufficient condition for this optimality. Consequently, methods that violate completeness, such as SmoothGrad-based variants, are provably suboptimal. Our technique, Expected Grad-CAM, instantiates this optimum via Expected Gradients and data-aware perturbations, and is purposefully designed as an enhanced substitute for the foundational Grad-CAM algorithm and any method built upon it. By recasting the original formulation as the smoothed expectation of perturbed integrated gradients, one can concurrently construct more faithful, localized, and robust explanations; by finely modulating the perturbation distribution, one can regulate explanation complexity by selectively discriminating stable features. We assess the effectiveness of our method through quantitative and qualitative evaluations.
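To make the abstract's core idea concrete, the sketch below illustrates one way the described estimator could look in PyTorch: Grad-CAM's single vanilla gradient is replaced by an Expected-Gradients-style expectation over baselines drawn from a reference set and interpolation coefficients alpha ~ U(0, 1), pooled into channel weights. This is a minimal sketch under stated assumptions, not the API of the linked repository; the function name `expected_gradcam`, the hook-based layer capture, and the `refs` argument are illustrative.

```python
import torch
import torch.nn.functional as F

def expected_gradcam(model, layer, x, target, refs, n_samples=32):
    """Sketch of an Expected Grad-CAM-style heatmap (illustrative, not the
    repo's API). Channel weights are estimated as an expectation over
    reference baselines and alpha ~ U(0, 1), in the spirit of Expected
    Gradients, rather than from a single vanilla gradient.

    x: (1, C, H, W) input; refs: (B, C, H, W) reference inputs drawn from
    the data distribution; model is assumed to be in eval() mode.
    """
    store = {}

    def hook(_module, _inputs, out):
        store["act"] = out
        if out.requires_grad:  # only attach during gradient-enabled passes
            out.register_hook(lambda g: store.update(grad=g))

    handle = layer.register_forward_hook(hook)
    try:
        with torch.no_grad():
            model(x)
            act_x = store["act"]          # A(x): activations of the clean input

        weight_sum = torch.zeros_like(act_x)
        for _ in range(n_samples):
            # Sample a data-aware baseline x' and an interpolation coefficient.
            x_ref = refs[torch.randint(len(refs), (1,)).item()].unsqueeze(0)
            with torch.no_grad():
                model(x_ref)
                act_ref = store["act"]    # A(x'): activations of the baseline

            alpha = torch.rand((), device=x.device)
            x_int = (x_ref + alpha * (x - x_ref)).requires_grad_(True)
            model.zero_grad(set_to_none=True)
            model(x_int)[0, target].backward()
            # Expected-Gradients-style term at the layer: dy/dA * (A(x) - A(x')).
            weight_sum += store["grad"] * (act_x - act_ref)
    finally:
        handle.remove()

    # Spatially pooled channel weights, then a Grad-CAM-style combination.
    w = (weight_sum / n_samples).mean(dim=(2, 3), keepdim=True)
    cam = F.relu((w * act_x).sum(dim=1, keepdim=True))
    return F.interpolate(cam, size=x.shape[-2:], mode="bilinear",
                         align_corners=False)
```

Note how the single-gradient, zero-baseline weighting of vanilla Grad-CAM is recovered as a degenerate case (one sample, a fixed baseline, and alpha fixed at 1), which is the sense in which the abstract frames existing CAMs as special cases of one perturbation distribution.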
Submission Type: Long submission (more than 12 pages of main content)
Changes Since Last Submission: 1. Abstract minimally updated to surface contributions added during the rebuttal phase; no new claims, numbers, or scope. 2. Two section headings ("Related Work" and "Conclusion and Broader Impact") normalized to title case. 3. Code-availability footnote added to the abstract pointing to the public GitHub repository.
Code: https://github.com/espressoshock/pytorch-expected-gradcam
Supplementary Material: zip
Assigned Action Editor: ~Hadi_Jamali-Rad1
Submission Number: 6944