Keywords: seizure detection, EEG, traffic forecasting, graph signal processing, attention, transformer
TL;DR: We propose a new self-attention mechanism that attends to all nodes over extended time periods, boosting performance in EEG seizure detection and traffic forecasting.
Abstract: This work introduces a Transformer-based approach for graph signal processing that leverages a novel task-specific attention mechanism, NTAttention.
Unlike conventional self-attention mechanisms, our method attends to all nodes across multiple time steps, enabling the model to effectively capture dependencies between nodes over extended time periods, a key limitation of traditional methods.
Additionally, we propose geometry-aware masking (GMask), which incorporates the graph topology into the sparsification of the self-attention matrix, improving efficiency while preserving the rich temporal information carried by the nodes.
We demonstrate the effectiveness of our approach on two critical applications: EEG seizure detection and traffic forecasting. Both tasks involve data collected from fixed sensors, such as electrodes or road sensors, where data from one sensor can influence others temporally and spatially. Our model improves sensitivity in fast seizure detection by 20 percentage points over the state of the art and significantly outperforms current methods in traffic forecasting.
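To make the two ideas in the abstract concrete, here is a minimal, hypothetical sketch (not the authors' implementation): a single-head, weight-free attention over all (time, node) tokens, sparsified by a mask derived from the sensor-graph topology in the spirit of GMask. The function names, the hop-based masking rule, and all parameters are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def topology_mask(adj: torch.Tensor, T: int, max_hops: int = 2) -> torch.Tensor:
    """Boolean (T*N, T*N) mask that permits attention between two
    (time, node) tokens only if the nodes lie within `max_hops` hops
    on the sensor graph; time-step pairs are left unrestricted."""
    N = adj.shape[0]
    hop = torch.eye(N, dtype=torch.bool) | (adj > 0)   # self + 1-hop
    for _ in range(max_hops - 1):
        hop = hop | ((hop.float() @ adj.float()) > 0)  # grow reachability
    # tokens are ordered time-major (index = t * N + n), so tile the
    # spatial reachability over every pair of time steps
    return hop.repeat(T, T)

def nt_attention(x: torch.Tensor, adj: torch.Tensor, max_hops: int = 2) -> torch.Tensor:
    """Attention over all (time, node) tokens jointly.
    x: (B, T, N, D) signals from N fixed sensors over T time steps."""
    B, T, N, D = x.shape
    tokens = x.reshape(B, T * N, D)                    # one token per (t, n)
    scores = tokens @ tokens.transpose(-2, -1) / D ** 0.5
    mask = topology_mask(adj, T, max_hops)             # graph-based sparsity
    scores = scores.masked_fill(~mask, float("-inf"))
    out = F.softmax(scores, dim=-1) @ tokens
    return out.reshape(B, T, N, D)

# Toy usage: a 4-node chain graph observed for 3 time steps.
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]])
x = torch.randn(2, 3, 4, 8)
print(nt_attention(x, adj).shape)  # torch.Size([2, 3, 4, 8])
```

Because the self-loop is always permitted, every row of the masked score matrix keeps at least its own node's tokens across all time steps, so the softmax is well defined even for isolated nodes.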
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13543