Temporal Hierarchies in Sequence to Sequence for Sentence Correction

IJCNN 2018
Abstract: This work tackles sentence correction in the language domain by approaching it as a sequence to sequence (seq2seq) problem with the help of temporal hierarchies. It does so by implementing a Multiple Timescales model of the Gated Recurrent Unit (MTGRU) in a Recurrent Neural Network (RNN) Encoder-Decoder framework, which can perform more meaningful data abstraction even in the presence of errors. The proposed language correction model is compared to three baseline models: the conventional RNN, the Long Short-Term Memory (LSTM), and the Gated Recurrent Unit (GRU), using a newly built dataset that consists of incorrect and correct sentences as input and target, respectively. The results show that the MTGRU model has better generalization performance and outperforms all three baselines on the BLEU-n evaluation metric.
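The abstract does not spell out the cell equations, but the multiple-timescales idea is commonly written as a low-pass-filtered hidden-state update, h_t = (1 - 1/tau) * h_{t-1} + (1/tau) * GRU(x_t, h_{t-1}), with a larger timescale constant tau in higher encoder layers. Below is a minimal sketch under that assumption; the class name MTGRUCell, the tau values, and the two-layer loop are illustrative and not the authors' implementation.

```python
# Minimal sketch of a multiple-timescales GRU cell (assumed formulation, not the paper's code).
import torch
import torch.nn as nn


class MTGRUCell(nn.Module):
    """GRU cell whose hidden state is low-pass filtered by a timescale constant tau."""

    def __init__(self, input_size: int, hidden_size: int, tau: float = 1.0):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.tau = tau  # tau = 1 recovers the standard GRU; larger tau -> slower dynamics

    def forward(self, x: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        h_candidate = self.cell(x, h_prev)  # conventional GRU update
        # Blend the previous state with the candidate according to the timescale.
        return (1.0 - 1.0 / self.tau) * h_prev + (1.0 / self.tau) * h_candidate


# Toy usage: two encoder layers on different timescales, so the upper layer
# changes more slowly and abstracts coarser structure from the lower one.
if __name__ == "__main__":
    batch, steps, emb, hid = 4, 10, 32, 64
    fast = MTGRUCell(emb, hid, tau=1.0)
    slow = MTGRUCell(hid, hid, tau=4.0)
    x = torch.randn(batch, steps, emb)
    h1 = torch.zeros(batch, hid)
    h2 = torch.zeros(batch, hid)
    for t in range(steps):
        h1 = fast(x[:, t], h1)  # fast, fine-grained layer
        h2 = slow(h1, h2)       # slow layer summarizing the fast layer's states
    print(h2.shape)  # torch.Size([4, 64])
```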