Geoclidean: Few-Shot Generalization in Euclidean Geometry

Published: 17 Sept 2022, Last Modified: 12 Mar 2024
NeurIPS 2022 Datasets and Benchmarks
Keywords: geometry, concept learning, few-shot generalization
TL;DR: A study of few-shot generalization of human and vision models in Euclidean geometry concepts.
Abstract: Euclidean geometry is among the earliest forms of mathematical thinking. While the geometric primitives underlying its constructions, such as perfect lines and circles, do not often occur in the natural world, humans rarely struggle to perceive and reason with them. Will computer vision models trained on natural images show the same sensitivity to Euclidean geometry? Here we explore this question by studying few-shot generalization in the universe of Euclidean geometry constructions. We introduce Geoclidean, a domain-specific language for Euclidean geometry, and use it to generate two datasets of geometric concept learning tasks for benchmarking generalization judgements of humans and machines. We find that humans are indeed sensitive to Euclidean geometry and generalize strongly from a few visual examples of a geometric concept. In contrast, low-level and high-level visual features from standard computer vision models pretrained on natural images do not support correct generalization. Thus Geoclidean represents a novel few-shot generalization benchmark for geometric concept learning, where the performance of humans and of AI models diverge. The Geoclidean framework and dataset are publicly available for download.
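The abstract's "universe of Euclidean geometry constructions" is built from compass-and-straightedge primitives: lines and circles defined by points. As a minimal illustration of that idea (a hypothetical sketch in plain Python, not the actual Geoclidean DSL or its API), the snippet below performs Euclid's first construction, the equilateral triangle on a segment, using only circle intersections:

```python
import math

def circle_circle_intersections(c1, r1, c2, r2):
    """Return the two intersection points of circles centered at c1, c2."""
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    # a: signed distance from c1 to the chord midpoint along the center line
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))  # half-chord length
    mx = x1 + a * (x2 - x1) / d
    my = y1 + a * (y2 - y1) / d
    ux, uy = (y2 - y1) / d, -(x2 - x1) / d  # unit normal to the center line
    return [(mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)]

def equilateral_apex(p1, p2):
    """Euclid I.1: apex of the equilateral triangle on segment p1-p2,
    found by intersecting the circle centered at p1 through p2 with the
    circle centered at p2 through p1."""
    r = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return circle_circle_intersections(p1, r, p2, r)[0]

p1, p2 = (0.0, 0.0), (1.0, 0.0)
apex = equilateral_apex(p1, p2)
```

A concept in this spirit is a program of such constructions; rendering samples of the program yields the positive examples a learner generalizes from.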
Framework: https://github.com/joyhsu0504/geoclidean_framework
Dataset: https://downloads.cs.stanford.edu/viscam/Geoclidean/geoclidean.zip
License: CC-BY 4.0
Author Statement: Yes
Supplementary Material: pdf
Contribution Process Agreement: Yes
In Person Attendance: Yes
Community Implementations: [1 code implementation (CatalyzeX)](https://www.catalyzex.com/paper/arxiv:2211.16663/code)