Track: Language Modeling
Keywords: macro placement, chip floorplanning, vision language models, reinforcement learning, in-context learning, evolutionary search, spatial reasoning
TL;DR: In-context learning for macro placement in computer chip design
Abstract: We propose using Vision-Language Models (VLMs) for macro placement in chip floorplanning, a complex optimization task where machine learning methods have recently shown promise. We hypothesize that the strong spatial reasoning and visual understanding capabilities of VLMs can effectively complement existing learning-based approaches. In this work, we introduce VeoPlace (Visual Evolutionary Optimization Placement), a novel framework that uses a VLM to guide the actions of a base placement policy by constraining them to subregions of the chip canvas. The VLM proposals are iteratively refined through an evolutionary search strategy that scores them by the quality of the resulting placement. On open-source benchmarks, VeoPlace achieves state-of-the-art results on four of seven benchmark circuits, matching or exceeding prior learning-only approaches. Our approach opens new possibilities for electronic design automation tools that leverage foundation models to solve complex physical design problems.
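The abstract's core loop can be sketched generically. The code below is a minimal illustration, not VeoPlace itself: all names (`propose`, `mutate`, `quality`) are hypothetical stand-ins, and the toy `quality` function merely substitutes for a real placement evaluation (in the actual framework, a VLM would propose canvas subregions and a base policy's placement result would be scored).

```python
import random

def evolutionary_search(propose, mutate, quality,
                        generations=20, population_size=8, elite_k=2):
    """Elitist evolutionary loop over candidate subregion proposals.

    propose()  -> a candidate (here: a point standing in for a canvas subregion)
    mutate(c)  -> a perturbed copy of candidate c
    quality(c) -> cost of the placement produced under candidate c (lower is better)
    """
    population = [propose() for _ in range(population_size)]
    for _ in range(generations):
        # Keep the best candidates, refill the rest by mutating elites.
        scored = sorted(population, key=quality)
        elites = scored[:elite_k]
        population = elites + [mutate(random.choice(elites))
                               for _ in range(population_size - elite_k)]
    return min(population, key=quality)

# Toy stand-ins: candidates are (x, y) points on a unit canvas, and
# "placement quality" is distance to a fixed target location.
random.seed(0)
target = (0.3, 0.7)

def propose():
    return (random.random(), random.random())

def mutate(c):
    clamp = lambda v: min(1.0, max(0.0, v))
    return (clamp(c[0] + random.uniform(-0.1, 0.1)),
            clamp(c[1] + random.uniform(-0.1, 0.1)))

def quality(c):
    return (c[0] - target[0]) ** 2 + (c[1] - target[1]) ** 2

best = evolutionary_search(propose, mutate, quality)
```

Because the elites are carried over unchanged each generation, the best candidate's quality is monotonically non-increasing; the search converges toward the low-cost region without any gradient information, mirroring how the abstract describes optimizing VLM proposals purely by resulting placement quality.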
Serve As Reviewer: ~Ikechukwu_Uchendu1, ~Vincent_Zhuang2
Submission Number: 95