Abstract: Brain-inspired hyperdimensional computing (HDC) has shown promise for highly accurate EMG-based gesture recognition owing to its few-shot learning capabilities and its robustness to noise and electrode placement variability. The simple paradigm is also ultra-low-power and low-latency, potentially enabling local, fast closed-loop control of assistive prosthetic devices. In this work, we propose a novel, fully HDC-based multi-level sensor fusion scheme for prosthetic control. While prior work has applied HDC only to the recognition of static gestures held for 5 seconds, prosthetic control depends on task recognition from continuously changing EMG signals. To achieve this, we collect a multi-sensor dataset covering 6 activities of daily living, with feedback sensors, detailed sub-tasks, and two levels of timing complexity. Finally, we demonstrate 91% task recognition accuracy on the continuously changing EMG data, contributing a basis on which higher levels of control can be designed.