Deep Neural Network-Based Accelerators for Repetitive Boolean Logic Evaluation

Published: 01 Jan 2023 · Last Modified: 02 Oct 2024 · SOCC 2023 · CC BY-SA 4.0
Abstract: This paper demonstrates that Deep Neural Networks (DNNs) can learn complex Boolean logic functions and accelerate logic evaluation while maintaining very high accuracy. A Fully Connected Neural Network (FCNN) and a novel Convolutional Neural Network (CNN) were used in the experiments. Results are presented for more than 5,000 Boolean functions from the ISCAS'89 and ITC'99 benchmarks. The FCNN achieved an average accuracy of 96.6%, while the CNN model achieved a superior average accuracy of 98.6%. The CNN executed on Graphics Processing Units (GPUs) was 69 times faster than a conventional logic simulator executed on a Central Processing Unit (CPU). When implemented on Memristor Crossbar Arrays (MCAs), it was approximately 190,000 times faster and consumed 4,000 times less power than the logic simulator. Moreover, the CNN implemented on MCAs was 2.3 times faster and consumed 5.6 times less power than Field Programmable Gate Array (FPGA)-based logic evaluation. These results on complex Boolean function evaluation indicate the potential of DNN-based accelerators to outperform existing conventional methods for evaluating logic circuits.
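The abstract does not detail the network architectures or training setup, but the core idea of learning a Boolean function from its truth table can be illustrated with a minimal sketch. The example below is purely hypothetical: it trains a tiny fully connected network (one hidden layer, hand-written backpropagation) on the 3-input majority function, standing in for the far larger benchmark circuits used in the paper.

```python
import numpy as np

# Hypothetical illustration: learn the 3-input majority Boolean
# function from its complete truth table with a small FCNN.
rng = np.random.default_rng(0)

# All 8 input combinations; target = 1 when at least two inputs are 1.
X = np.array([[(i >> b) & 1 for b in range(3)] for i in range(8)], dtype=float)
y = (X.sum(axis=1) >= 2).astype(float).reshape(-1, 1)

# One hidden layer (tanh) and a sigmoid output unit.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)       # hidden activations
    p = sigmoid(h @ W2 + b2)       # predicted probability of output 1
    # Backpropagate the mean-squared-error gradient by hand.
    dp = (p - y) * p * (1.0 - p)
    dW2 = h.T @ dp; db2 = dp.sum(axis=0)
    dh = (dp @ W2.T) * (1.0 - h ** 2)
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Threshold the network output and measure truth-table accuracy.
pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) >= 0.5).astype(float)
accuracy = float((pred == y).mean())
print(accuracy)
```

In this toy setting the network typically recovers the full truth table; the paper's contribution is showing that accuracy remains high when the same idea is scaled to thousands of benchmark functions and mapped onto GPUs and memristor crossbar arrays.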