A Contextual Discretization framework for compressing Recurrent Neural Networks

ICLR 2017 workshop submission
Abstract: In this paper, we address the problem of training Recurrent Neural Networks with binary weights. We introduce a novel Contextual Discretization (CD) framework and showcase its effectiveness across multiple RNN architectures and two disparate tasks. We also propose a modified GRU architecture that can harness the CD method while using weights exclusively in $\{-1, 1\}$, which in turn reduces the number of power-of-two bit multiplications from $O(n^3)$ to $O(n^2)$.
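
To make the weight-binarization idea concrete, below is a minimal, hypothetical PyTorch sketch of a GRU cell whose weight matrices are sign-binarized to $\{-1, 1\}$ in the forward pass and trained with a straight-through estimator. It is not the authors' CD framework or modified GRU; the class names (`BinarizeSTE`, `BinaryGRUCell`) and all details are illustrative assumptions.

```python
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Sign-binarize weights in the forward pass; pass gradients straight
    through in the backward pass (straight-through estimator)."""

    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        # torch.sign maps to {-1, 0, +1}; values are in {-1, +1} except
        # for exact zeros in the real-valued shadow weights.
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_output):
        (w,) = ctx.saved_tensors
        # Clip the gradient where |w| > 1 (BinaryConnect-style STE).
        return grad_output * (w.abs() <= 1).float()


class BinaryGRUCell(nn.Module):
    """Hypothetical GRU cell with binarized weight matrices. The optimizer
    updates the real-valued shadow weights; only their signs are used in
    the forward computation."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.w_ih = nn.Parameter(torch.randn(3 * hidden_size, input_size) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(3 * hidden_size, hidden_size) * 0.1)
        self.b = nn.Parameter(torch.zeros(3 * hidden_size))

    def forward(self, x, h):
        w_ih = BinarizeSTE.apply(self.w_ih)   # binarized input-to-hidden weights
        w_hh = BinarizeSTE.apply(self.w_hh)   # binarized hidden-to-hidden weights
        gi = x @ w_ih.t() + self.b
        gh = h @ w_hh.t()
        i_r, i_z, i_n = gi.chunk(3, dim=-1)
        h_r, h_z, h_n = gh.chunk(3, dim=-1)
        r = torch.sigmoid(i_r + h_r)          # reset gate
        z = torch.sigmoid(i_z + h_z)          # update gate
        n = torch.tanh(i_n + r * h_n)         # candidate state
        return (1 - z) * n + z * h


# Example usage with arbitrary sizes.
cell = BinaryGRUCell(input_size=16, hidden_size=32)
x, h = torch.randn(8, 16), torch.zeros(8, 32)
h_next = cell(x, h)
```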
TL;DR: A new technique for compressing Recurrent Neural Networks via weight binarization
Conflicts: none@none.com
Keywords: Deep learning, Supervised Learning, Applications