Online Constrained Meta-Learning: Provable Guarantees for Generalization

Published: 21 Sept 2023, Last Modified: 02 Jan 2024
Venue: NeurIPS 2023 (spotlight)
Keywords: meta-learning; generalization
Abstract: Meta-learning has attracted attention for its strong ability to learn from experience on known tasks, which can speed up and enhance learning on new tasks. However, most existing meta-learning approaches can only learn from tasks without constraints. This paper proposes an online constrained meta-learning framework, which continuously learns meta-knowledge from a sequence of learning tasks, where each task is subject to hard constraints. Going beyond existing meta-learning analyses, we provide upper bounds on the optimality gaps and constraint violations produced by the proposed framework; these bounds account for the dynamic regret of online learning as well as the generalization ability of the task-specific models. Moreover, we provide a practical algorithm for the framework and validate its effectiveness through experiments on meta-imitation learning and few-shot image classification.
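To make the setting concrete, here is a minimal sketch of online constrained meta-learning, not the paper's algorithm: meta-parameters are updated after each task in a stream, and each task-specific model is kept feasible by projecting onto a hard constraint set (a Euclidean norm ball is used here purely as an illustrative stand-in; the quadratic task losses are likewise hypothetical).

```python
# Illustrative sketch only (not the paper's method): online meta-learning over a
# task stream with a hard constraint enforced by projection onto a norm ball.
import numpy as np

def project_ball(w, radius=1.0):
    """Project w onto {w : ||w|| <= radius}, the assumed hard constraint set."""
    norm = np.linalg.norm(w)
    return w if norm <= radius else w * (radius / norm)

def adapt(theta, grad_fn, steps=5, lr=0.1, radius=1.0):
    """Inner loop: projected gradient steps on the task loss, starting from theta."""
    w = theta.copy()
    for _ in range(steps):
        w = project_ball(w - lr * grad_fn(w), radius)
    return w

def online_meta_learning(tasks, dim=3, meta_lr=0.05, radius=1.0):
    """Outer loop: tasks arrive sequentially; theta is updated after each one."""
    theta = np.zeros(dim)
    violations = []
    for grad_fn in tasks:
        w = adapt(theta, grad_fn, radius=radius)       # task-specific model
        violations.append(max(0.0, np.linalg.norm(w) - radius))
        theta = project_ball(theta - meta_lr * grad_fn(w), radius)
    return theta, violations

# Toy task stream: quadratic losses 0.5*||w - c_t||^2 with random targets c_t,
# so the gradient of task t is simply w - c_t.
rng = np.random.default_rng(0)
tasks = [(lambda w, c=rng.normal(size=3): w - c) for _ in range(20)]
theta, violations = online_meta_learning(tasks)
print(max(violations))  # projection keeps every adapted model feasible: 0.0
```

The projection step is what distinguishes this from unconstrained online meta-learning: every task-specific model remains feasible by construction, so constraint violations stay at zero in this toy example, while the analysis in the paper bounds violations and optimality gaps in the general case.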
Supplementary Material: zip
Submission Number: 2568