Federated Transformer-based Lightweight Modeling for Epilepsy Prediction using Twofold Personalization

Published: 25 Sept 2024, Last Modified: 06 Nov 2024, IEEE BHI'24, CC BY 4.0
Keywords: Client-Level Personalization, Epilepsy Prediction, Federated Transformer, Hypernetwork, Knowledge Distillation, Multimodal, Patient-Specific Personalization, Twofold Personalization
Abstract: Automated epilepsy diagnosis research aims to improve prediction models using Electroencephalography (EEG) signals. Federated Learning (FL) preserves medical data privacy while accessing knowledge from multiple clients. Although collaboration addresses data scarcity, designing lightweight, personalized predictors with federated transformers over distributed EEG data remains challenging. Additional modalities provide complementary knowledge for enhanced predictions, so a lightweight model for early, accurate, personalized seizure prediction from multimodal signals offers significant opportunities. This work introduces client-level and patient-specific personalization using a federated transformer model. The self-attention mechanism in federated transformers can degrade results under data heterogeneity, limiting collaboration. A hypernetwork addresses this by learning personalized self-attention layers, generating EEG-representative attention maps, and eliminating global aggregation of self-attention weights; the remaining model parameters are aggregated globally. Client-level personalization uses a local transformer (teacher model) for prediction. Knowledge distillation then creates a lightweight patient-level model (student model) from the teacher model's weights, integrating multimodal signals for patient-specific prediction. Validated on the CHB-MIT dataset, the approach accurately determines the preictal state, outperforming existing models with 95.58% sensitivity and 98.61% specificity. It also achieves a False Positive Rate (FPR) of only 0.014 when fine-tuning each hospital's student model on multimodal data, guided by the federated generalized knowledge of the teacher model.
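The abstract's central mechanism, a hypernetwork that generates each client's self-attention weights from a client embedding instead of aggregating attention weights globally, can be illustrated with a minimal NumPy sketch. All names, dimensions, and the single-linear-layer hypernetwork here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_embed = 8, 4  # model width and client-embedding size (illustrative)

# Hypothetical hypernetwork: one linear map from a client embedding to the
# flattened Q/K/V projection weights of a personalized self-attention layer.
W_hyper = rng.standard_normal((3 * d_model * d_model, d_embed)) * 0.1

def personalized_attention(x, client_embedding):
    """Self-attention whose Q/K/V projections are generated per client."""
    flat = W_hyper @ client_embedding
    Wq, Wk, Wv = flat.reshape(3, d_model, d_model)
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(d_model)          # scaled dot-product scores
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)      # attention map over tokens
    return attn @ V

x = rng.standard_normal((5, d_model))             # 5 EEG tokens (toy input)
out_a = personalized_attention(x, rng.standard_normal(d_embed))
out_b = personalized_attention(x, rng.standard_normal(d_embed))
```

Because the attention weights are a function of the client embedding, only the shared hypernetwork (and the non-attention parameters) need global aggregation; two clients with different embeddings produce different attention maps over the same EEG tokens, which is the personalization the abstract describes.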
Track: 11. General Track
Registration Id: 83N3PJXYBLX
Submission Number: 293
