ShapeY: Measuring Shape Recognition Capacity Using Nearest Neighbor Matching

Published: 24 Nov 2021, Last Modified: 05 May 2023
Venue: ImageNet PPF 2021
Keywords: Shape benchmark, viewpoint invariance, embedding space, ImageNet, ResNet, transfer learning
TL;DR: We developed a benchmark that measures shape recognition capacity and viewpoint invariance using simple nearest neighbor matching.
Abstract: Object recognition in humans depends primarily on shape cues. We have developed a new approach to measuring the shape recognition performance of a vision system based on nearest neighbor view matching within the system's embedding space. Our performance benchmark, ShapeY, allows for precise control of task difficulty by enforcing that view matching span a specified degree of 3D viewpoint change and/or appearance change. As a first test case we measured the performance of ResNet50 pre-trained on ImageNet. Matching error rates were high. For example, a 27-degree change in object pitch led ResNet50 to match the incorrect object 45% of the time. Appearance changes were also highly disruptive. Examination of false matches indicates that ResNet50's embedding space is severely "tangled". These findings suggest ShapeY can be a useful tool for charting the progress of artificial vision systems towards human-level shape recognition capabilities.
Submission Track: Main track, 5 pages max
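The measurement described in the abstract is plain nearest-neighbor view matching in a model's embedding space. The sketch below is not the ShapeY implementation; it is a minimal illustration assuming torchvision's pre-trained ResNet50, 2048-d penultimate-layer features, and cosine similarity, with hypothetical helper names (embed, nearest_neighbor_match).

# Minimal sketch (not the ShapeY code) of nearest-neighbor view matching in a
# CNN embedding space, assuming torchvision's pre-trained ResNet50 and cosine
# similarity over L2-normalized penultimate-layer features.
import torch
import torch.nn.functional as F
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.IMAGENET1K_V2
model = resnet50(weights=weights)
model.fc = torch.nn.Identity()   # keep the 2048-d embedding, drop the classifier
model.eval()
preprocess = weights.transforms()

@torch.no_grad()
def embed(pil_images):
    # Map a list of PIL images to L2-normalized embedding vectors.
    batch = torch.stack([preprocess(img) for img in pil_images])
    return F.normalize(model(batch), dim=1)

def nearest_neighbor_match(query_emb, gallery_emb, gallery_object_ids):
    # Return the object id of the gallery view closest to the query view
    # (dot product equals cosine similarity because embeddings are normalized).
    sims = gallery_emb @ query_emb
    return gallery_object_ids[sims.argmax().item()]

# Usage: embed one query view and a gallery of candidate views spanning the
# required viewpoint/appearance change; the matching error rate is the fraction
# of queries whose nearest neighbor belongs to the wrong object.
# predicted = nearest_neighbor_match(embed([query_view])[0],
#                                    embed(gallery_views), gallery_object_ids)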