Bi-ACTCNet: A Bidirectional Channel Attention and Mutual-Cross-Attention Temporal Feature Extraction Network for Motor-Rest Classification in Motor Imagery
Abstract: Combining brain–computer interface (BCI) technology with the Internet of Things (IoT) for practical motor rehabilitation is a critical research direction aimed at enhancing the decoding performance of BCIs and their clinical value. BCIs for single rehabilitation movements have begun to be applied in clinical settings. However, in the field of electroencephalography (EEG) signal decoding, research specifically focused on motor-rest classification within single motor imagery (MI) tasks remains relatively limited. To address this gap, we propose the bidirectional channel attention and mutual-cross-attention temporal feature extraction network (Bi-ACTCNet), adapted for operation within IoT environments. The network performs motor-rest classification across three datasets, outperforming existing models. A key innovation of Bi-ACTCNet is its bidirectional temporal feature extraction method, which reveals that time-reversed EEG signals contain additional features that improve model accuracy. In addition, the model extends the efficient channel attention (ECA) block into an efficient temporal channel attention (ETCA) block and, for the first time, uses mutual-cross-attention (MCA) to fuse features from the bidirectional branches. Ablation experiments verify the effectiveness of each block. To further substantiate the model's effectiveness and generalizability, we conducted two-class classification experiments on the BCI Competition IV-2a dataset, achieving excellent performance.
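The mutual-cross-attention fusion of the two temporal branches can be sketched as follows. This is a minimal NumPy illustration of the general idea (each branch queries the other with scaled dot-product attention, and the attended outputs are fused), not the authors' implementation; the function names, the concatenation-based fusion, and the toy dimensions are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_src, kv_src, d_k):
    # Queries come from one branch; keys/values from the other
    # (scaled dot-product attention, identity projections for brevity).
    scores = q_src @ kv_src.T / np.sqrt(d_k)
    return softmax(scores, axis=-1) @ kv_src

def mutual_cross_attention(x_fwd, x_bwd):
    # Each branch attends to the other; outputs fused by concatenation
    # along the feature dimension (fusion choice is an assumption).
    d = x_fwd.shape[-1]
    fwd_attended = cross_attention(x_fwd, x_bwd, d)
    bwd_attended = cross_attention(x_bwd, x_fwd, d)
    return np.concatenate([fwd_attended, bwd_attended], axis=-1)

# Toy bidirectional features: 8 time steps, 16-dim embeddings per branch;
# the backward branch is simply the time-reversed forward features here.
rng = np.random.default_rng(0)
x_fwd = rng.standard_normal((8, 16))
x_bwd = x_fwd[::-1]
fused = mutual_cross_attention(x_fwd, x_bwd)
print(fused.shape)  # (8, 32)
```

In a trained network the two branches would carry learned forward- and reverse-time features, and learned query/key/value projections would replace the identity projections used here.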
External IDs: dblp:journals/iotj/CaoCD25