Abstract: Analysis of hand gestures plays a pivotal role in understanding the underlying mood and story of a dance performance. The Indian classical dance Bharatanatyam possesses a predefined set of hand gestures, and automatic recognition of these gestures helps transcribe the central story of a performance. This paper proposes two pipelines for automatically recognizing hand gestures in Bharatanatyam. Both approaches take an RGB image containing Bharatanatyam hand gestures as input and produce the corresponding hand gesture class as output. Our first framework uses a pre-trained MediaPipe model for hand detection and a Gaussian probability distribution for gesture recognition; it requires hardly any training data. In contrast, our second framework trains a YOLOv6 model on the Oxford hand dataset [1] for hand detection and a ResNet18 model on the Bharatanatyam hand gesture dataset [2] for gesture recognition. The first framework mainly addresses the data-insufficiency problem by relying on pre-trained architectures, and it is also suitable for analyzing finger movement during transitions between mudras. The second framework, by contrast, performs robust hand gesture recognition with an overall accuracy of 98.86%. The paper compares the results of the two frameworks and highlights the advantages of using each.
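The abstract does not specify the exact features or covariance structure the first framework uses, so the following is only a minimal sketch of the Gaussian-likelihood classification idea: fit one diagonal Gaussian per gesture class over landmark-derived feature vectors (e.g., normalized MediaPipe hand-landmark coordinates), then assign a new sample to the class with the highest log-likelihood. The class names and feature dimensionality below are hypothetical.

```python
import numpy as np

def fit_class_gaussians(features_by_class):
    """Fit a per-class diagonal Gaussian (mean, variance) over feature vectors.

    features_by_class: dict mapping gesture label -> array of shape (n, d),
    where each row is a landmark-derived feature vector.
    """
    params = {}
    for label, X in features_by_class.items():
        X = np.asarray(X, dtype=float)
        # Small floor on the variance avoids division by zero for
        # coordinates that barely vary within a class.
        params[label] = (X.mean(axis=0), X.var(axis=0) + 1e-6)
    return params

def classify(x, params):
    """Return the class whose Gaussian gives x the highest log-likelihood."""
    x = np.asarray(x, dtype=float)
    best_label, best_ll = None, -np.inf
    for label, (mu, var) in params.items():
        # Diagonal-Gaussian log-density, up to the same constant per class.
        ll = -0.5 * np.sum(np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)
        if ll > best_ll:
            best_label, best_ll = label, ll
    return best_label
```

A quick usage example with synthetic clusters standing in for two mudra classes (real inputs would be landmark features extracted from the detected hand region):

```python
rng = np.random.default_rng(0)
train = {
    "pataka":    rng.normal(0.0, 0.1, size=(50, 4)),  # hypothetical class
    "tripataka": rng.normal(1.0, 0.1, size=(50, 4)),  # hypothetical class
}
params = fit_class_gaussians(train)
print(classify([0.02, -0.01, 0.0, 0.05], params))
```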
External IDs: dblp:journals/mta/PaulSDR25