BLUnet: Arithmetic-free Inference with Bit-serialised Table Lookup Operation for Efficient Deep Neural Networks

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Efficient Inference, Deep Neural Networks
Abstract: Deep neural networks (DNNs) are both computation and memory intensive. Large numbers of costly arithmetic multiply-accumulate (MAC) operations and large amounts of data movement hinder their application to edge AI, where DNN models must run on energy-constrained platforms. Table lookup operations have potential advantages over traditional arithmetic multiplication and addition in terms of both energy consumption and latency in hardware implementations of DNNs. Moreover, integrating the weights into the lookup tables eliminates costly weight movement. However, the challenge of using table lookups lies in scaling: the size and lookup time of a table grow exponentially with its fan-in. In this paper, we propose BLUnet, a table lookup-based DNN model with bit-serialised inputs that overcomes this challenge. Using binarised time-series inputs, we solve the fan-in issue of lookup tables. BLUnet achieves not only high efficiency but also accuracy on par with MAC-based neural networks. We confirm this with experiments on popular models for computer vision applications. Our experimental results show that, compared to MAC-based baseline designs as well as state-of-the-art solutions, BLUnet achieves orders-of-magnitude improvements in energy efficiency.
One-sentence Summary: We propose BLUnet, a bit-serialised table lookup-based DNN model design for efficient inference.
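The abstract does not give implementation details, but the core idea it describes, folding weights into small lookup tables indexed by binarised input bits and serialising the input bit by bit so that table fan-in stays small, can be illustrated with a minimal NumPy sketch. Everything below is an assumption for illustration only: the names (`build_tables`, `lut_neuron`), the parameters (`FAN_IN`, `NUM_BITS`), and the grouping scheme are hypothetical, and the sketch still combines partial lookups with ordinary shifts and adds for clarity, whereas the paper targets an arithmetic-free hardware realisation whose details are not specified in this abstract.

```python
import numpy as np

# Hypothetical parameters (not from the paper): table fan-in and activation bit-width.
FAN_IN = 4     # binary inputs per lookup table; each table has 2**FAN_IN entries
NUM_BITS = 4   # bit-serial precision of the integer activations

def build_tables(weights):
    """Fold a neuron's weights into per-group lookup tables.

    weights: 1-D array whose length is a multiple of FAN_IN.
    Returns an array of shape (2**FAN_IN, num_groups) holding the partial dot
    product of every possible binary input pattern with each group's weights.
    """
    groups = weights.reshape(-1, FAN_IN)                                   # (G, FAN_IN)
    patterns = (np.arange(2 ** FAN_IN)[:, None] >> np.arange(FAN_IN)) & 1  # (2**FAN_IN, FAN_IN)
    return patterns @ groups.T                                             # (2**FAN_IN, G)

def lut_neuron(x_int, tables):
    """Evaluate one neuron by bit-serialised table lookups.

    x_int: integer activations in [0, 2**NUM_BITS), length = num_groups * FAN_IN.
    Each bit-plane of the input addresses the tables; the partial results are
    combined with shifts and adds here purely for clarity of the sketch.
    """
    acc = 0
    for b in range(NUM_BITS):                       # serialise the input bit by bit
        bits = (x_int >> b) & 1                     # current binary bit-plane
        addr = bits.reshape(-1, FAN_IN) @ (1 << np.arange(FAN_IN))  # one address per group
        partial = tables[addr, np.arange(tables.shape[1])]          # one lookup per group
        acc += partial.sum() * (1 << b)             # weight lookups by bit significance
    return acc

# Tiny usage check: the bit-serial LUT evaluation matches an ordinary dot product.
rng = np.random.default_rng(0)
w = rng.standard_normal(16)
x = rng.integers(0, 2 ** NUM_BITS, size=16)
assert np.isclose(lut_neuron(x, build_tables(w)), float(x @ w))
```

The point of the bit-serial scheme in this sketch is that each table is addressed by a single bit-plane at a time, so table size depends only on FAN_IN and not on the activation precision; how BLUnet removes the remaining accumulation to become fully arithmetic-free is a claim of the paper itself and is not reconstructed here.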