Abstract: Deep convolutional neural networks rely on heavily optimized convolution algorithms. Winograd convolutions provide an efficient approach to performing such convolutions. With larger Winograd tiles, convolution becomes more efficient but less numerically accurate. Here we present approaches to mitigating this numerical inaccuracy, exemplified on a tile much larger than any previously documented: F(9x9, 5x5). Using these approaches, we show that such a tile can be used to train modern networks while providing performance benefits.
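For context, here is a minimal 1-D sketch of the smallest commonly used tile, F(2, 3), using the standard transform matrices from Lavin and Gray (arXiv:1509.09308) and assuming NumPy. The F(9x9, 5x5) tile discussed in the abstract is the 2-D analogue with much larger transform matrices, whose growing entry magnitudes are the source of the numerical inaccuracy the paper targets.

```python
import numpy as np

# Standard F(2, 3) transforms: input transform B^T, filter transform G,
# output transform A^T. Two outputs of a 3-tap filter are computed with
# 4 elementwise multiplies instead of 6.
BT = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=np.float64)
G  = np.array([[1.0,  0.0, 0.0],
               [0.5,  0.5, 0.5],
               [0.5, -0.5, 0.5],
               [0.0,  0.0, 1.0]], dtype=np.float64)
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=np.float64)

def winograd_f23(d, g):
    """y = A^T [ (G g) * (B^T d) ] for a 4-element input tile d and 3-tap filter g."""
    return AT @ ((G @ g) * (BT @ d))

# Check against direct (valid) correlation on a random tile.
d = np.random.randn(4)
g = np.random.randn(3)
direct = np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                   d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])
print(np.allclose(winograd_f23(d, g), direct))  # True (to fp64 rounding)
```

For F(2, 3) the transform entries are small (0, ±1, ±1/2) and the rounding error is negligible; for tiles as large as F(9x9, 5x5) the analogous matrices contain much larger entries, which is why mitigation techniques are needed before such tiles become usable in training.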
TL;DR: By improving the numerical stability of Winograd convolutions, we are able to use larger tiles which provide performance benefits to convolutional neural networks.
Keywords: Deep learning
Conflicts: nvidia.com