Forward-backward retraining of recurrent neural networks

NIPS 1995 (modified: 11 Nov 2022)
Abstract: This paper describes the training of a recurrent neural network as the letter posterior probability estimator for a hidden Markov model, off-line handwriting recognition system. The network estimates posterior distributions for each of a series of frames representing sections of a handwritten word. The supervised training algorithm, backpropagation through time, requires target outputs to be provided for each frame. Three methods for deriving these targets are presented. A novel method based upon the forward-backward algorithm is found to result in the recognizer with the lowest error rate.
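
To make the target-derivation idea concrete, below is a minimal sketch (not the authors' implementation) of how forward-backward state occupation probabilities can serve as soft per-frame targets for a frame-level posterior estimator. The function name, array shapes, and the toy HMM parameters are illustrative assumptions, not details from the paper.

    import numpy as np
    from scipy.special import logsumexp

    def forward_backward(log_a, log_b, log_pi):
        """Return per-frame state occupation probabilities gamma[t, j].

        log_a  : (S, S) log transition matrix
        log_b  : (T, S) log observation scores for each frame
        log_pi : (S,)   log initial state distribution
        """
        T, S = log_b.shape
        log_alpha = np.full((T, S), -np.inf)
        log_beta = np.zeros((T, S))

        # Forward pass: alpha[t, j] = P(o_1..o_t, q_t = j)
        log_alpha[0] = log_pi + log_b[0]
        for t in range(1, T):
            for j in range(S):
                log_alpha[t, j] = log_b[t, j] + logsumexp(log_alpha[t - 1] + log_a[:, j])

        # Backward pass: beta[t, i] = P(o_{t+1}..o_T | q_t = i)
        for t in range(T - 2, -1, -1):
            for i in range(S):
                log_beta[t, i] = logsumexp(log_a[i] + log_b[t + 1] + log_beta[t + 1])

        # gamma[t, j] = P(q_t = j | o_1..o_T): the soft frame targets
        log_gamma = log_alpha + log_beta
        log_gamma -= logsumexp(log_gamma, axis=1, keepdims=True)
        return np.exp(log_gamma)

    # Toy usage: 5 frames, 3 letter states, flat transitions (illustrative values only).
    rng = np.random.default_rng(0)
    T_frames, S_states = 5, 3
    log_b = np.log(rng.dirichlet(np.ones(S_states), size=T_frames))  # stand-in for network frame scores
    log_a = np.log(np.full((S_states, S_states), 1.0 / S_states))
    log_pi = np.log(np.full(S_states, 1.0 / S_states))
    gamma = forward_backward(log_a, log_b, log_pi)  # (T, S) soft targets, rows sum to 1

Under this reading, the resulting gamma values would replace hard one-of-N alignment targets in the per-frame training criterion optimized with backpropagation through time.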