Track: Paper
Keywords: Machine Learning, Time-series Classification, Dance Performance, New Media Arts, Motion Recognition, Human Activity Recognition, Human-Computer Interaction, Interactive Machine Learning, Inertial Measurement Unit (IMU) Sensor, MiniRocket
Abstract: We introduce a lightweight, real-time motion recognition system that enables synergistic human-machine performance through wearable IMU sensor data, MiniRocket time-series classification, and responsive multimedia control. By mapping dancer-specific movement to sound through somatic memory and association, we propose an alternative approach to human-machine collaboration, one that preserves the expressive depth of the performing body while leveraging machine learning for attentive observation and responsiveness. We demonstrate that this human-centered design reliably supports high-accuracy classification at low latency (<50 ms), offering a replicable framework for integrating dance-literate machines into creative, educational, and live performance contexts.
Submission Number: 43