Learning to Learn with Conditional Class Dependencies

27 Sept 2018, 22:39 (modified: 23 Jan 2023, 18:06) · ICLR 2019 Conference Blind Submission
Keywords: meta-learning, learning to learn, few-shot learning
TL;DR: CAML is an instance of MAML with conditional class dependencies.
Abstract: Neural networks can learn to extract statistical properties from data, but they seldom make use of structured information from the label space to help representation learning. Although some label structure can implicitly be obtained when training on huge amounts of data, in a few-shot learning context where little data is available, making explicit use of the label structure can inform the model to reshape the representation space to reflect a global sense of class dependencies. We propose a meta-learning framework, Conditional class-Aware Meta-Learning (CAML), that conditionally transforms feature representations based on a metric space that is trained to capture inter-class dependencies. This enables a conditional modulation of the feature representations of the base-learner to impose regularities informed by the label space. Experiments show that the conditional transformation in CAML leads to more disentangled representations and achieves competitive results on the miniImageNet benchmark.
Data: [mini-Imagenet](https://paperswithcode.com/dataset/mini-imagenet)
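The abstract's central idea, conditionally transforming the base-learner's features using a metric space that captures class structure, can be sketched as a FiLM-style scale-and-shift. The prototype computation and the linear maps `W_gamma`/`W_beta` below are illustrative assumptions for a minimal sketch, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def class_prototypes(support_embeddings, support_labels, n_classes):
    """Mean embedding per class in the metric space (prototype-style),
    intended to capture inter-class structure from the support set."""
    return np.stack([
        support_embeddings[support_labels == c].mean(axis=0)
        for c in range(n_classes)
    ])

def film_modulate(features, prototype, W_gamma, W_beta):
    """Conditionally scale and shift base-learner features given a
    class prototype (FiLM-style modulation; maps are assumptions)."""
    gamma = 1.0 + prototype @ W_gamma  # multiplicative term, near identity
    beta = prototype @ W_beta          # additive term
    return gamma * features + beta

# Toy 2-way task: 4 support examples, 8-dim metric embeddings, 16-dim features.
emb = rng.normal(size=(4, 8))
labels = np.array([0, 0, 1, 1])
protos = class_prototypes(emb, labels, n_classes=2)

W_gamma = rng.normal(scale=0.1, size=(8, 16))
W_beta = rng.normal(scale=0.1, size=(8, 16))
x = rng.normal(size=(16,))
x_mod = film_modulate(x, protos[0], W_gamma, W_beta)  # class-conditioned features
```

Because the modulation depends on which class prototype is supplied, the same base features are reshaped differently per class, which is how the label-space structure informs the representation.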