A Flexible Memory Controller Supporting Deep Belief Networks with Fixed-Point Arithmetic

Published: 01 Jan 2013, IPDPS Workshops 2013
Abstract: Deep Belief Networks (DBNs) are state-of-the-art machine learning techniques and among the most important unsupervised learning algorithms. Training DBNs is computationally intensive, which naturally motivates investigating FPGA acceleration. Fixed-point arithmetic can strongly influence both the execution time and the prediction accuracy of a DBN. Previous studies have focused only on customized DBN accelerators with a fixed data-width. Our experimental results demonstrate that supporting various data-widths across different DBN configurations and application environments is necessary for achieving acceptable performance. From this we conclude that a DBN accelerator should support various data-widths rather than a single fixed one, as done in previous work. The processing performance of FPGA-based DBN accelerators is almost always constrained not by the capacity of the processing units, but by the capacity and speed of their on-chip RAM. We propose an efficient memory controller for DBN accelerators which shows that supporting various data-widths is not as difficult as it may sound: the hardware cost is small and does not affect the critical path. We have also designed a tool that helps users flexibly reconfigure the memory controller for arbitrary data-widths.
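
The paper itself presents no code; as a rough software illustration of the core idea, the Python sketch below packs values of an arbitrary data-width into fixed-width on-chip RAM words and unpacks them again. The 32-bit word width, the function names, and the test values are assumptions chosen for illustration, not the paper's actual controller design.

# Minimal sketch (not from the paper): storing values of an arbitrary
# data-width in fixed-width on-chip RAM words. RAM_WORD_BITS is an
# assumed physical word width for illustration.

RAM_WORD_BITS = 32

def pack(values, width):
    """Concatenate `width`-bit values into one bitstream, then slice
    the stream into RAM_WORD_BITS-wide words."""
    bits = 0
    for i, v in enumerate(values):
        bits |= (v & ((1 << width) - 1)) << (i * width)
    total = len(values) * width
    n_words = -(-total // RAM_WORD_BITS)  # ceiling division
    mask = (1 << RAM_WORD_BITS) - 1
    return [(bits >> (i * RAM_WORD_BITS)) & mask for i in range(n_words)]

def unpack(words, width, count):
    """Recover `count` values of `width` bits from packed RAM words."""
    bits = 0
    for i, w in enumerate(words):
        bits |= w << (i * RAM_WORD_BITS)
    mask = (1 << width) - 1
    return [(bits >> (i * width)) & mask for i in range(count)]

# Five 6-bit values fit in a single 32-bit RAM word; a fixed 32-bit
# layout would instead burn one word per value.
vals = [0x1A, 0x2B, 0x3C, 0x0D, 0x1E]
words = pack(vals, 6)
assert unpack(words, 6, len(vals)) == vals

In hardware, the same effect is achieved with shift-and-mask logic whose parameters depend only on the configured data-width, which is consistent with the abstract's claim that the extra cost is small and off the critical path.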