Confirmation: I have read and agree with the IEEE BHI 2025 conference submission's policy on behalf of myself and my co-authors.
Keywords: Optomyography, Machine learning, Augmentation, Motor decoding, Electromyography, Human-machine interaction, Brain-computer interfaces
Abstract: Despite significant advances in brain-computer interface (BCI) technology, systems that leverage physiological signals to detect and recognize human intentions in real time remain underdeveloped. Reaching a new level of human-machine interaction requires integrating correlates of motor activity with state-of-the-art artificial intelligence (AI) architectures. In this study, we present the first demonstration of decoding handwriting, a complex motor task, using a novel myographic method called Optomyography (OMG). Unlike previous electromyography (EMG)-based approaches, which treat handwriting decoding as a classification problem, we frame it as continuous trajectory reconstruction. We evaluated GRUScribe (a GRU-based decoder) and TransScribe (a transformer-based decoder), successfully decoding 10 numerical digits and 33 Russian letters from 20 able-bodied and 4 amputee participants, without requiring elaborate preprocessing. Our results demonstrate the remarkable potential of OMG for recognizing complex motor activity. We believe this work sets a new benchmark in non-invasive muscle-activity decoding, offering direct applications in advanced prosthetic control and human-machine interfaces.
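To illustrate the trajectory-reconstruction framing described in the abstract, below is a minimal PyTorch sketch of a GRU-based decoder in the spirit of GRUScribe. The channel count, layer sizes, bidirectionality, and MSE objective are illustrative assumptions, not the authors' actual architecture or hyperparameters.

```python
# Hypothetical sketch: a GRU decoder regressing a 2D pen trajectory from
# multichannel OMG signals. All sizes below are assumed for illustration.
import torch
import torch.nn as nn


class GRUTrajectoryDecoder(nn.Module):
    """Maps an OMG window (batch, time, channels) to (x, y) per time step."""

    def __init__(self, n_channels: int = 8, hidden_size: int = 128, n_layers: int = 2):
        super().__init__()
        # Recurrent encoder over the raw OMG time series.
        self.gru = nn.GRU(
            input_size=n_channels,
            hidden_size=hidden_size,
            num_layers=n_layers,
            batch_first=True,
            bidirectional=True,
        )
        # Linear readout regresses pen position (or velocity) at every step.
        self.head = nn.Linear(2 * hidden_size, 2)

    def forward(self, omg: torch.Tensor) -> torch.Tensor:
        features, _ = self.gru(omg)          # (batch, time, 2 * hidden_size)
        return self.head(features)           # (batch, time, 2)


# Unlike a classifier over letter labels, training minimizes a regression
# loss between the decoded and the recorded pen trajectory.
model = GRUTrajectoryDecoder()
dummy_omg = torch.randn(4, 500, 8)           # 4 windows, 500 samples, 8 channels
trajectory = model(dummy_omg)                # (4, 500, 2)
loss = nn.functional.mse_loss(trajectory, torch.zeros_like(trajectory))
```

The same input/output contract would apply to a transformer-based variant such as TransScribe, swapping the recurrent encoder for self-attention layers.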
Track: 7. General Track
Registration Id: GKN33NP4YGJ
Submission Number: 83