Abstract: Long Short-Term Memory (LSTM) cells are modified Recurrent Neural Networks
(RNNs) that can learn long temporal dependencies in sequence data. The basic building block of
an LSTM layer is a unit called a memory cell. It has a recurrent connection to itself and several
activation gates that regulate the flow of information into and out of the cell, and it retains a
memory state representing relevant information learned from the input time series
(Hochreiter and Schmidhuber, 1997).
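For concreteness, the gate-and-state updates of a memory cell at time step t can be written as
follows (this is the commonly used formulation, which adds a forget gate to the original 1997
cell; the notation here is illustrative, not the paper's):

\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)} \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate state)} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(memory state)} \\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden output)}
\end{aligned}

where \sigma is the logistic sigmoid and \odot denotes element-wise multiplication; the input,
forget, and output gates control what enters, persists in, and leaves the memory state c_t.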
LSTM networks are state of the art in many fields, including natural language processing
(Greff et al., 2017). Their ability to approximate dynamical time-variant systems (Li et al., 2005)
and to learn temporal patterns that span large intervals makes LSTM-based architectures a
logical choice for classifying EEG signals (Tsiouris et al., 2018).
There are many variants of this architecture, of which the bidirectional LSTM (BiLSTM)
cell is of particular interest for classifying EEG data. One can visualize this layer as two
standard memory cells parsing the data in opposite directions, enabling the individual cells to
update learned representations using either past or future time points. The use of future
time points to predict the current cell state requires that the signal be a complete time series
rather than an evolving sequence (Schuster and Paliwal, 1997).
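As an illustrative sketch of such a layer (a PyTorch implementation is assumed here; the layer
sizes, class name, and classification head are hypothetical and not the paper's configuration):

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    """Illustrative BiLSTM classifier for EEG feature sequences.

    All sizes are hypothetical and do not reflect the paper's configuration.
    """

    def __init__(self, n_features=20, hidden_size=64, n_classes=2):
        super().__init__()
        # bidirectional=True runs two LSTMs over the sequence, one forward
        # in time and one backward, concatenating their hidden states.
        self.bilstm = nn.LSTM(
            input_size=n_features,
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        # Forward and backward states are concatenated, doubling the width.
        self.head = nn.Linear(2 * hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time, n_features). The backward pass reads future time
        # points, so x must be a complete time series, not a growing stream.
        out, _ = self.bilstm(x)
        # Classify from the last time step's concatenated states.
        return self.head(out[:, -1, :])

# Usage: logits = BiLSTMClassifier()(torch.randn(8, 100, 20))
```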
The average cross-validation accuracy of the proposed classifier is
89.51%, with a standard deviation of 4.7%. This work suggests that the BiLSTM classifier can
provide a robust framework for EEG mental workload classification when coupled with
spectral and non-linear features.
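A hedged sketch of how such cross-validated accuracy figures are typically computed (the fold
count, feature arrays, and the helpers train_bilstm and predict_bilstm below are assumptions
for illustration, not the paper's protocol):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

def cross_validate(X, y, n_splits=10, seed=0):
    """Mean and standard deviation of per-fold test accuracy.

    X: (n_trials, time, n_features) array of EEG feature sequences.
    y: (n_trials,) array of mental workload labels.
    """
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    accs = []
    for train_idx, test_idx in skf.split(X, y):
        model = train_bilstm(X[train_idx], y[train_idx])    # hypothetical trainer
        preds = predict_bilstm(model, X[test_idx])          # hypothetical predictor
        accs.append(float(np.mean(preds == y[test_idx])))
    return np.mean(accs), np.std(accs)
```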