Online Sequential Learning from Physiological Data with Weighted Prototypes: Tackling Cross-Subject Variability

ICLR 2025 Conference Submission 12787 Authors

28 Sept 2024 (modified: 28 Nov 2024), ICLR 2025 Conference Submission, CC BY 4.0
Keywords: Online Continual Learning, Physiological Signals, Cross-Subject Variability
Abstract: Online Continual Learning (OCL) enables machine learning models to adapt to sequential data streams in real time, especially when only a small amount of data is available. However, applying OCL to physiological data such as electroencephalography (EEG) and electrocardiography (ECG) is often complicated by inter-subject variability, which can lead to catastrophic forgetting and performance degradation. Existing OCL methods do not address this challenge effectively and struggle to retain previously learned knowledge while adapting to new data. This paper presents Online Prototypes Weighted Aggregation (OPWA), a novel method designed to mitigate catastrophic forgetting in the presence of inter-subject variability through the use of prototypical networks. OPWA retains knowledge from past subjects while adapting to new data streams. It does so via a prototype aggregation mechanism that fuses intra-class prototypes into generalized representations by accounting for both within-class and inter-class variation across subjects. Extensive experiments show that OPWA consistently outperforms existing OCL methods in both fast adaptation and mitigation of catastrophic forgetting on physiological datasets spanning multiple modalities, providing a robust solution for learning on sequential data streams.
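The abstract leaves the aggregation mechanism at a high level. As a rough illustration of the idea only, the NumPy sketch below fuses per-subject class prototypes using weights built from a Fisher-style ratio of inter-class separation to within-class variance. The function names (class_stats, fisher_weights, fuse_prototypes) and the specific weighting rule are assumptions made for illustration, not the authors' actual OPWA formulation.

    import numpy as np

    def class_stats(embeddings, labels):
        """Per-class prototype (mean embedding) and mean within-class variance
        for one subject's labelled stream. embeddings: (n, d) array, labels: (n,) array."""
        stats = {}
        for c in np.unique(labels):
            x = embeddings[labels == c]
            stats[c] = (x.mean(axis=0), float(x.var(axis=0).mean()))
        return stats

    def fisher_weights(stats, eps=1e-8):
        """Per-class weight for one subject: inter-class separation of its prototype
        divided by within-class variance (a Fisher-style ratio). Illustrative choice,
        not necessarily the paper's exact rule."""
        protos = {c: p for c, (p, _) in stats.items()}
        weights = {}
        for c, (p, v) in stats.items():
            others = [q for k, q in protos.items() if k != c]
            sep = np.mean([np.linalg.norm(p - q) for q in others]) if others else 1.0
            weights[c] = sep / (v + eps)
        return weights

    def fuse_prototypes(subject_stats, eps=1e-8):
        """Fuse per-subject intra-class prototypes into one generalized prototype
        per class, weighting each subject's contribution by its Fisher-style ratio."""
        all_weights = [fisher_weights(s, eps) for s in subject_stats]
        classes = set().union(*(s.keys() for s in subject_stats))
        fused = {}
        for c in classes:
            ps, ws = [], []
            for stats, wts in zip(subject_stats, all_weights):
                if c in stats:
                    ps.append(stats[c][0])
                    ws.append(wts[c])
            w = np.asarray(ws)
            w /= w.sum()
            fused[c] = (w[:, None] * np.stack(ps)).sum(axis=0)
        return fused

    # usage sketch: fused = fuse_prototypes([class_stats(E1, y1), class_stats(E2, y2)])

In an online setting, the per-subject statistics would presumably be updated incrementally as each subject's stream arrives, with the fused prototypes then serving nearest-prototype classification of new samples.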
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12787