LLM-IMC: Automating Analog In-Memory Computing Architecture Generation with Large Language Models

Published: 2025 · Last Modified: 12 Nov 2025 · FCCM 2025 · CC BY-SA 4.0
Abstract: Resistive crossbars enabling analog In-Memory Computing (IMC) have garnered significant attention from academia and industry as a promising architecture for Deep Neural Network (DNN) acceleration, thanks to their high memory access bandwidth and in-situ computing capabilities. However, the knowledge-intensive hardware design process and the lack of high-quality circuit netlists have constrained design space exploration and optimization of analog IMC to behavioral system-level tools. In this one-page abstract, we introduce LLM-IMC, a novel, fine-tuning-free Large Language Model (LLM) framework, supported by a Python-based tool, for analog IMC SPICE code generation. LLM-IMC systematically addresses these limitations by automating the creation of diverse IMC simulation scripts, enabling efficient, LLM-driven design space exploration, and outlining an integration roadmap for hardware-oriented neuromorphic crossbar design flows.
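To make the target output concrete, the following is a minimal, illustrative Python sketch of the kind of resistive-crossbar SPICE netlist such a flow might emit. The function name, node naming, and fixed-resistor cell model are assumptions for illustration only; they are not LLM-IMC's actual tool or prompting interface.

```python
# Illustrative sketch: generate a SPICE netlist for an N x M resistive crossbar.
# Each crosspoint cell is modeled as a fixed resistor 1/G; bit lines are held at
# ground through 0 V sources so column currents approximate analog dot products.
# All names here (generate_crossbar_netlist, wl/bl node labels) are hypothetical.

def generate_crossbar_netlist(rows: int, cols: int, conductances, v_in) -> str:
    """Emit a SPICE netlist for a rows x cols crossbar.

    conductances[i][j]: programmed conductance (S) of the cell at word line i,
    bit line j. v_in[i]: input voltage applied to word line i. The current
    measured on bit line j approximates sum_i v_in[i] * conductances[i][j].
    """
    lines = [f"* {rows}x{cols} resistive crossbar (illustrative)"]
    # Input voltage sources drive the word lines.
    for i, v in enumerate(v_in):
        lines.append(f"Vin{i} wl{i} 0 DC {v}")
    # One resistive cell per crosspoint.
    for i in range(rows):
        for j in range(cols):
            lines.append(f"R{i}_{j} wl{i} bl{j} {1.0 / conductances[i][j]:.6g}")
    # Zero-volt sources on each bit line act as ammeters for column currents.
    for j in range(cols):
        lines.append(f"Vmeas{j} bl{j} 0 DC 0")
    lines += [".op", ".end"]
    return "\n".join(lines)

if __name__ == "__main__":
    G = [[1e-4, 5e-5], [2e-4, 1e-4]]  # 2x2 conductance matrix in siemens
    print(generate_crossbar_netlist(2, 2, G, v_in=[0.3, 0.7]))
```

In an LLM-driven flow, a script of this form would typically be produced from a natural-language specification (array size, device model, peripheral circuitry) and then swept across design parameters by the surrounding Python tooling.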