PCNN: Probable-Class Nearest-Neighbor Explanations Improve Fine-Grained Image Classification Accuracy for AIs and Humans

TMLR Paper2552 Authors

19 Apr 2024 (modified: 03 Jul 2024) · Under review for TMLR · CC BY-SA 4.0
Abstract: Nearest neighbors (NN) are traditionally used to compute final decisions, e.g., in Support Vector Machines or k-NN classifiers, and to provide users with explanations for the model's decision. In this paper, we show a novel utility of nearest neighbors: to improve predictions of a frozen, pretrained classifier C. We leverage an image comparator S that (1) compares the input image with NN images from the top-K most probable classes; and (2) uses S's output scores to weight the confidence scores of C. Our method consistently improves fine-grained image classification accuracy on CUB-200, Cars-196, and Dogs-120. Also, a human study finds that showing lay users our probable-class nearest neighbors (PCNN) improves their decision accuracy over prior work, which shows examples from only the top-1 class.
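The re-weighting step described in the abstract can be summarized in a few lines. The following is a minimal sketch, not the authors' released code: a frozen classifier C proposes its top-K classes, a comparator S scores the query against one nearest-neighbor (NN) image per candidate class, and S's scores re-weight C's confidences before the final prediction. Names such as `classifier`, `comparator`, and `class_nn_bank` are assumptions for illustration.

```python
import torch


@torch.no_grad()
def pcnn_rerank(query, classifier, comparator, class_nn_bank, k=10):
    """Re-rank a frozen classifier's prediction using comparator scores.

    query         : (1, 3, H, W) input image tensor
    classifier    : frozen model C returning logits of shape (1, num_classes)
    comparator    : model S taking (query, nn_image) and returning a
                    similarity score in [0, 1] of shape (1,)
    class_nn_bank : dict mapping class index -> (1, 3, H, W) tensor holding
                    that class's nearest-neighbor training image for the query
    """
    probs = classifier(query).softmax(dim=-1)        # C's confidence scores
    topk_conf, topk_cls = probs.topk(k, dim=-1)      # top-K probable classes

    reweighted = []
    for conf, cls in zip(topk_conf[0], topk_cls[0]):
        nn_image = class_nn_bank[int(cls)]           # NN from that class
        s = comparator(query, nn_image)              # S's similarity score
        reweighted.append(conf * s.squeeze())        # weight C's score by S

    reweighted = torch.stack(reweighted)
    best_class = int(topk_cls[0][reweighted.argmax()])  # re-ranked prediction
    return best_class, reweighted
```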
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Added experiments (Sec. E) and revised the writing per reviewers' requests.
Assigned Action Editor: ~Sivan_Sabato1
Submission Number: 2552