Improving Large Language Model Hardware Generating Quality through Post-LLM Search

NeurIPS 2023 Workshop MLSys Submission 12 Authors

Published: 28 Oct 2023, Last Modified: 12 Dec 2023. MLSys Workshop @ NeurIPS 2023 Poster.
Keywords: Large Language Model, Natural Language Programming, Hardware Design
Abstract: As large language models (LLMs) like ChatGPT exhibit unprecedented machine intelligence, they also show great promise in assisting hardware engineers to realize higher-efficiency logic designs via natural language interaction. However, due to the limitations of LLMs, existing LLM-based hardware generation frameworks produce Verilog register-transfer level (RTL) code without considering its performance, power, and area (PPA). To overcome this challenge, we design a post-LLM search approach that **merges the design space exploration (DSE) process into the current LLM hardware generation workflow**, enabling PPA optimization. First, our framework generates prompts for the LLM, which then produces initial Verilog programs. Second, an output manager corrects and optimizes these programs before collecting them into the final design space, which is constructed as an HDL search tree. Finally, in the post-search stage, our framework searches this space to select the optimal design under the target metrics. The evaluation shows that our approach improves the quality of generated Verilog and explores a broader design optimization space than prior work and native LLMs alone.
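The post-search stage described above can be illustrated with a minimal sketch. The `Candidate` record, its metric fields, and the weighted-cost selection below are all illustrative assumptions; the abstract does not specify how design-space entries are represented or how target metrics are combined:

```python
from dataclasses import dataclass

# Hypothetical record for one collected Verilog design; the paper's actual
# design space is an HDL search tree, not specified further in the abstract.
@dataclass
class Candidate:
    name: str
    power_mw: float   # estimated power
    area_um2: float   # estimated area
    delay_ns: float   # estimated critical-path delay (performance)

def ppa_cost(c: Candidate, w_power=1.0, w_area=1.0, w_delay=1.0) -> float:
    # Weighted sum of PPA metrics; the weights encode the target metrics'
    # relative priorities (assumed form, for illustration only).
    return w_power * c.power_mw + w_area * c.area_um2 + w_delay * c.delay_ns

def post_llm_search(candidates):
    # Search the collected design space for the candidate that minimizes
    # the PPA cost under the target metrics.
    return min(candidates, key=ppa_cost)

designs = [
    Candidate("adder_v1", power_mw=1.2, area_um2=340.0, delay_ns=2.1),
    Candidate("adder_v2", power_mw=0.9, area_um2=410.0, delay_ns=1.8),
    Candidate("adder_v3", power_mw=1.5, area_um2=300.0, delay_ns=2.4),
]
best = post_llm_search(designs)
print(best.name)  # → adder_v3 (lowest weighted PPA cost in this toy set)
```

A real DSE pass would traverse the HDL search tree rather than a flat list, but the selection criterion is the same idea: rank candidates by the target metrics and keep the optimum.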
Submission Number: 12