On Improving the Numerical Stability of Winograd Convolutions
Kevin Vincent, Kevin Stephano, Michael Frumkin, Boris Ginsburg, Julien Demouth
Feb 17, 2017 (modified: Feb 18, 2017) · ICLR 2017 workshop submission · readers: everyone
Abstract: Deep convolutional neural networks rely on heavily optimized convolution algorithms. Winograd convolutions provide an efficient approach to performing such convolutions. Larger Winograd tiles make the convolution more efficient but less numerically accurate. Here we provide approaches to mitigating this numerical inaccuracy, exemplified on a tile much larger than any previously documented: F(9x9, 5x5). Using these approaches, we show that such a tile can be used to train modern networks while providing performance benefits.
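As an illustrative sketch (not the paper's F(9x9, 5x5) construction), the classic 1-D Winograd F(2, 3) algorithm from Lavin and Gray computes two outputs of a 3-tap correlation with 4 multiplications instead of 6. Larger tiles follow the same transform structure, but their transform matrices contain larger entries, which amplifies floating-point round-off; the input values and matrices below are chosen only for demonstration.

```python
import numpy as np

# Standard transform matrices for Winograd F(2, 3):
# output = A_T @ ((G @ g) * (B_T @ d))
B_T = np.array([[1,  0, -1,  0],
                [0,  1,  1,  0],
                [0, -1,  1,  0],
                [0,  1,  0, -1]], dtype=np.float64)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]], dtype=np.float64)
A_T = np.array([[1, 1,  1,  0],
                [0, 1, -1, -1]], dtype=np.float64)

def winograd_f23(d, g):
    """Compute 2 outputs of the correlation of a 4-element input tile d
    with a 3-element filter g via the Winograd F(2, 3) algorithm."""
    U = G @ g            # transformed filter (4 values)
    V = B_T @ d          # transformed input tile (4 values)
    return A_T @ (U * V) # elementwise product, then inverse transform

# Compare against direct correlation on a small example.
d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, 0.25, 0.125])
direct = np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                   d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])
print(winograd_f23(d, g))                            # [1.375 2.25]
print(np.max(np.abs(winograd_f23(d, g) - direct)))   # numerical error
```

The elementwise error printed last is the quantity that grows with tile size, since larger tiles require transform matrices with larger and more varied entries.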
TL;DR: By improving the numerical stability of Winograd convolutions, we are able to use larger tiles, which provide performance benefits to convolutional neural networks.