Spatial-Temporal Mutual Distillation for Lightweight Sleep Stage Classification

22 Sept 2023 (modified: 20 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Knowledge Distillation; Sleep Stage Classification; Time Series
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Sleep stage classification has important clinical significance for the diagnosis of sleep-related diseases. Recently, multi-channel sleep signals have been widely used in deep neural networks for sleep stage classification, achieving better performance than single-channel signals because of the rich spatial-temporal knowledge they contain. However, this comes with a large increase in model size and computational cost, which constrains the deployment of multi-channel sleep stage classification models. Knowledge distillation is an effective way to compress models, but existing knowledge distillation methods cannot fully extract and transfer the spatial-temporal knowledge in multi-channel sleep signals. To address this problem, we propose spatial-temporal mutual distillation for multi-channel sleep stage classification. Spatial-temporal knowledge is a key reference for sleep stage classification: spatial knowledge represents the spatial relationships of the human body, while temporal knowledge captures the transition rules between consecutive sleep epochs. Moreover, the mutual distillation framework transfers spatial-temporal knowledge between the teacher and the student in both directions to improve knowledge transfer. Results on the ISRUC-III and MASS-SS3 datasets show that our proposed method compresses sleep models effectively with minimal performance loss and achieves state-of-the-art performance compared to the baseline methods.
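The abstract describes a mutual distillation framework in which the teacher and student exchange softened predictions in both directions. The paper's actual spatial-temporal losses are not specified here, so the following is only a minimal NumPy sketch of generic mutual (bidirectional) logit distillation; the function names, temperature value, and the 5-class sleep-stage setup are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the class axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q), averaged over the batch.
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))

def mutual_distillation_loss(teacher_logits, student_logits, T=2.0):
    """Symmetric (mutual) distillation sketch: each network matches the
    other's temperature-softened class distribution. Hypothetical; the
    paper's spatial-temporal variant is not public."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    loss_to_student = kl(p_t, p_s) * T * T  # student mimics teacher
    loss_to_teacher = kl(p_s, p_t) * T * T  # teacher also learns from student
    return loss_to_student, loss_to_teacher

# Toy example: batch of 4 sleep epochs, 5 stages (W, N1, N2, N3, REM).
rng = np.random.default_rng(0)
t_logits = rng.normal(size=(4, 5))
s_logits = rng.normal(size=(4, 5))
l_s, l_t = mutual_distillation_loss(t_logits, s_logits)
```

In training, each of these terms would typically be added to the corresponding network's cross-entropy loss, so that the two models co-train rather than distilling in one direction only.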
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4481