Keywords: Self-supervised learning, multivariate time series, graph neural networks, seizure analysis
TL;DR: We propose a novel architecture for seizure analysis that learns task-specific networks, jointly with a self-supervised pretraining strategy
Abstract: Recurrent Graph Neural Networks are very effective at modeling brain activity, thanks to their spatio-temporal inductive bias, and they show further capabilities when combined with self-supervised pretraining methods. For instance, they show improved performance on epileptic-seizure analysis, namely detection and classification, compared to convolutional and classical recurrent neural networks. Still, the graphs used by these methods are generally predefined and often provide little insight into the task. We build upon recent advances in graph learning for time series forecasting to propose a novel architecture that learns task-specific networks, jointly with a self-supervised pretraining strategy. We study the performance of learned graphs at different scales by comparing static and dynamic networks, and illustrate the outstanding performance of our model on epilepsy classification and detection tasks.
Submission Number: 49