Progressive Knowledge Distillation For Early Action Recognition

ICIP 2021 (modified: 02 Nov 2022)
Abstract: We present a novel framework for training a recurrent neural network for early recognition of human actions, an important but challenging task given the need to recognize an ongoing action from partial observation. Our framework is based on knowledge distillation, where the network for early recognition is viewed as a student model. The student is trained using knowledge distilled from a more knowledgeable teacher model that can peek into the future and incorporate extra observations about the action under consideration. This framework can be used in both supervised and semi-supervised learning settings, and can exploit both labeled and unlabeled training data. Experiments on the UCF101, SYSU 3DHOI, and NTU RGB-D datasets show the effectiveness of knowledge distillation for early recognition, including when only a small amount of annotated training data is available.
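To make the teacher–student idea concrete, the sketch below shows a standard temperature-softened distillation objective of the kind such a framework could use: the student (seeing only partial observations) is pushed toward the predictive distribution of the teacher (which sees additional future observations). This is a minimal illustration, not the paper's implementation; the function names, the temperature value, and the use of a plain KL divergence are assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a
    # softer (more uniform) distribution over action classes.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # the usual distillation objective. student_logits would come
    # from the early-recognition model on a partial observation;
    # teacher_logits from a model that also saw future frames.
    # (Hypothetical helper; not from the paper.)
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

Note that this loss needs no ground-truth label, which is what lets a distillation framework of this kind also use unlabeled videos in the semi-supervised setting.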