Few-Shot Attribute Learning

Published: 28 Jan 2022, Last Modified: 13 Feb 2023. ICLR 2022 Submission.
Keywords: Few-shot learning, transfer learning, attribute learning
Abstract: Semantic concepts are often defined by a combination of attributes, and attributes also facilitate learning new concepts from zero or few examples. However, the zero-shot learning paradigm assumes that the set of attributes is known and fixed, which becomes a limitation when a test-time task depends on a previously irrelevant attribute. In this work, we study rapid learning of attributes that were not previously labeled in the dataset. Compared to standard few-shot learning of semantic classes, learning new attributes poses a stiffer challenge. We found that directly supervising the model with a set of training attributes does not generalize well to the test attributes, whereas self-supervised pre-training brings a significant improvement. We further experimented with random splits of the attribute space and found that the predictability of attributes provides an informative estimate of a model's ability to generalize.
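To make the evaluation protocol described in the abstract concrete, below is a minimal sketch of how one might randomly split an attribute space into train/test subsets and sample a binary few-shot episode for a held-out attribute. The names (`attribute_matrix`, `sample_episode`), the 30/10 split, and the episode sizes are illustrative assumptions, not the paper's actual code or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N examples, each annotated with A binary attributes.
# The attribute space is randomly split into disjoint train/test subsets,
# so test-time attributes were never labeled during training.
N, A = 1000, 40
attribute_matrix = rng.integers(0, 2, size=(N, A))  # placeholder labels

perm = rng.permutation(A)
train_attrs, test_attrs = perm[:30], perm[30:]  # split ratio is an assumption


def sample_episode(attr_id, k_shot=5, n_query=15):
    """Sample a binary few-shot episode for one held-out attribute:
    k positive and k negative support examples, plus a query set."""
    pos = rng.permutation(np.flatnonzero(attribute_matrix[:, attr_id] == 1))
    neg = rng.permutation(np.flatnonzero(attribute_matrix[:, attr_id] == 0))
    support = np.concatenate([pos[:k_shot], neg[:k_shot]])
    query = np.concatenate([pos[k_shot:k_shot + n_query],
                            neg[k_shot:k_shot + n_query]])
    labels = attribute_matrix[:, attr_id]
    return (support, labels[support]), (query, labels[query])


# Example: build an episode for the first unseen (test) attribute.
(support_idx, support_y), (query_idx, query_y) = sample_episode(test_attrs[0])
print(support_idx.shape, query_idx.shape)
```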