SELECT: Search-Enhanced Language Models for Analog Circuit Topology Generation

ICLR 2026 Conference Submission 12942 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Analog Topology Generation
Abstract: Automating analog circuit topology design is essential to reduce the extensive manual effort required to meet increasingly diverse and customized application demands. Recent advances have applied sequence-to-sequence fine-tuning on pretrained language models to directly generate circuit topologies from user specifications in a single pass. However, these one-shot generation methods fail to generate complex circuits because of their exponentially growing search spaces and limited training datasets. In this paper, we present SELECT, a search-enhanced language model framework that integrates simulator-guided Monte Carlo Tree Search (MCTS) with transformer-based decoding, exploiting test-time computation for improved performance. SELECT introduces novel structural token pruning and P-UCB-based node selection that leverage next-token probability distributions to guide the search. By combining pretrained priors with simulator feedback at inference time, SELECT converges faster than prior search methods and achieves significantly higher generation success rates, improving by up to 435\% over RL-based search and 145\% over LaMAGIC under a strict tolerance of 0.01. These results establish SELECT as the first scalable framework for high-fidelity analog topology generation and a practical step toward LLM-driven circuit design automation.
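The abstract describes P-UCB-based node selection in which the language model's next-token probabilities act as priors over the search tree. The sketch below shows a generic PUCT-style selection rule of this kind; it is an illustrative assumption, not the paper's exact variant, and names such as `Node`, `prior`, and the constant `c_puct` are hypothetical.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node in a token-level search tree (illustrative data structure)."""
    prior: float                    # LM next-token probability P(token | prefix)
    visit_count: int = 0
    value_sum: float = 0.0          # accumulated simulator reward
    children: dict = field(default_factory=dict)  # token -> Node

    def q_value(self) -> float:
        # Mean simulator reward of rollouts that passed through this node.
        return self.value_sum / self.visit_count if self.visit_count else 0.0

def p_ucb_select(parent: Node, c_puct: float = 1.0):
    """Select the child token maximizing Q + c * prior * sqrt(N_parent) / (1 + N_child)."""
    def score(child: Node) -> float:
        exploration = c_puct * child.prior * math.sqrt(parent.visit_count) / (1 + child.visit_count)
        return child.q_value() + exploration
    # Returns the (token, child) pair with the highest P-UCB score.
    return max(parent.children.items(), key=lambda kv: score(kv[1]))
```

Under this kind of rule, tokens the pretrained model considers likely are explored first, while simulator feedback accumulated in `value_sum` gradually redirects the search toward topologies that actually meet the specification.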
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 12942