Task-Specific Gesture Analysis in Real-Time Using Interpolated Views

Published: 1996, IEEE Trans. Pattern Anal. Mach. Intell. (CC BY-SA 4.0)
Abstract: Hand and face gestures are modeled using an appearance-based approach in which patterns are represented as a vector of similarity scores to a set of view models defined in space and time. These view models are learned from examples using unsupervised clustering techniques. A supervised learning paradigm is then used to interpolate view scores into a task-dependent coordinate system appropriate for recognition and control tasks. We apply this analysis to the problem of context-specific gesture interpolation and recognition, and demonstrate real-time systems which perform these tasks.
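The pipeline described in the abstract can be sketched in three pieces: scoring a pattern against view models, learning the view models from unlabeled examples, and fitting a supervised map from score vectors to task coordinates. The sketch below is an illustrative approximation only, not the paper's implementation: it assumes normalized correlation as the similarity measure, k-means as the unsupervised clustering step, and a least-squares linear map as the supervised interpolator; all function names are hypothetical.

```python
import numpy as np

def similarity_scores(pattern, view_models):
    # Score a pattern against each view model with normalized
    # correlation (an assumed similarity measure).
    p = pattern - pattern.mean()
    scores = []
    for v in view_models:
        vc = v - v.mean()
        denom = np.linalg.norm(p) * np.linalg.norm(vc) + 1e-9
        scores.append(float(p @ vc) / denom)
    return np.array(scores)

def learn_view_models(examples, k, iters=20, seed=0):
    # Unsupervised clustering of example views; plain k-means
    # stands in for the paper's clustering technique.
    rng = np.random.default_rng(seed)
    centers = examples[rng.choice(len(examples), k, replace=False)]
    for _ in range(iters):
        dists = ((examples[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = examples[labels == j].mean(0)
    return centers

def fit_interpolator(score_vectors, task_coords):
    # Supervised step: least-squares map from view-score vectors
    # to a task-dependent coordinate system.
    W, *_ = np.linalg.lstsq(score_vectors, task_coords, rcond=None)
    return W
```

At run time, a new frame is reduced to its score vector and multiplied by the learned map, which is cheap enough for real-time recognition and control.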