A Smoothing Regularizer for Recurrent Neural Networks

1995 (modified: 11 Nov 2022), NIPS 1995
Abstract: We derive a smoothing regularizer for recurrent network models by requiring robustness in prediction performance to perturbations of the training data. The regularizer can be viewed as a generalization of the first-order Tikhonov stabilizer to dynamic models. The closed-form expression of the regularizer covers both time-lagged and simultaneous recurrent nets, with feedforward nets and one-layer linear nets as special cases. We have successfully tested this regularizer in a number of case studies and found that it performs better than standard quadratic weight decay.
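The abstract's closed-form regularizer is not reproduced here, but the baseline it generalizes can be sketched. A first-order Tikhonov stabilizer penalizes the sensitivity of the network output to input perturbations, i.e. the squared norm of the input-output Jacobian. The sketch below (an illustration, not the paper's regularizer) shows the one-layer linear special case mentioned in the abstract, where the Jacobian of y = Wx is W itself and the penalty reduces to quadratic weight decay.

```python
import numpy as np

def tikhonov_penalty_linear(W):
    """First-order Tikhonov stabilizer for a one-layer linear net y = W x.

    The input-output Jacobian of y = W x is W, so penalizing the squared
    Jacobian norm reduces to the squared Frobenius norm of W, which is
    exactly standard quadratic weight decay.
    """
    return float(np.sum(W ** 2))

# Example weight matrix (illustrative values only)
W = np.array([[1.0, -2.0],
              [0.5,  3.0]])
print(tikhonov_penalty_linear(W))  # 1 + 4 + 0.25 + 9 = 14.25
```

For deeper or recurrent models the Jacobian depends on the input and on the unrolled dynamics, which is why a closed-form dynamic generalization, as derived in the paper, is nontrivial.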