Keywords: Activity Recognition, Continual Learning, Prompt Tuning, Few-Shot Class Incremental Learning (FSCIL)
TL;DR: We introduce POET, a novel prompt-offset tuning solution for privacy-aware few-shot continual action recognition.
Abstract: As virtual reality and augmented reality redefine how users interact with computing devices, research in action and gesture recognition is gaining prominence. Typically, the models deployed on AR/VR devices are trained at the factory on large proprietary datasets. Although this training covers a major set of activity and gesture classes, the user should ideally be able to add new
classes to the model without forgetting the base set of classes. Importantly, the user can provide only a few samples per class in this process. To protect the user's privacy, the setting should also forbid storing and replaying data samples for future learning. We formalize this pragmatic problem setting as privacy-aware few-shot class-incremental learning for activities and gestures.
To this end, we propose a novel strategy, POET: Prompt-offset Tuning. Unlike other prompt tuning approaches that demand access to transformer models pretrained on large amounts of data, our approach demonstrates the efficacy of prompting a significantly smaller model trained exclusively on data from the base classes. Additionally, we exploit the temporal ordering in streams of actions and gestures to propose a unique temporally ordered learnable prompt selection and prompt attachment mechanism. To evaluate our newly proposed problem setting, we introduce new benchmarks on the NTU RGB+D dataset for action recognition and the SHREC-2017 dataset for hand gesture recognition.
Supplementary Material: pdf
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6237