Parameter Monte Carlo Tree Search: Efficient Chip Placement via Transfer Learning

26 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: Transfer learning, chip placement, reinforcement learning, Monte Carlo Tree Search, finetuning
TL;DR: By using Monte Carlo Tree Search and optimizing a network's parameters directly, we can successfully transfer a deep RL model from one chip placement task to another.
Abstract: Automated chip placement is an important problem for improving the design and performance of computer chips. Previous approaches have employed transfer learning to adapt knowledge obtained via machine learning from one chip placement task to another. However, these approaches have not notably reduced the required chip design time, which is crucial for minimizing total resource utilization. This paper introduces a novel transfer learning approach called Parameter Monte Carlo Tree Search (PMCTS) that uses MCTS to transfer the knowledge learned by deep reinforcement learning (RL) models trained on one chip design task to another, searching directly over the model parameters to produce models for efficient chip placement. We employ MCTS to escape the local optima reached by training from scratch and by fine-tuning. We evaluate our methodology on four chip design tasks from the literature: Ariane, Ariane133, IBM01, and IBM02. Through extensive experiments, we find that our approach can generate models for optimized chip placement in less time than training from scratch and fine-tuning when transferring knowledge from complex chip designs to simpler ones.
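The abstract describes searching directly over a pretrained policy's parameters with MCTS. Below is a minimal, hypothetical sketch of that idea, not the authors' implementation: the `evaluate_placement` reward, the Gaussian-perturbation expansion step, and all hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch of "parameter MCTS": tree search over perturbations of a
# pretrained policy's parameter vector. `evaluate_placement` stands in for
# loading the parameters into the RL policy, rolling out a placement on the
# target netlist, and scoring it; here it is a toy quadratic.
import math
import numpy as np

def evaluate_placement(params):
    # Placeholder reward (e.g., negative wirelength/congestion in practice).
    return -float(np.sum(params ** 2))

class Node:
    def __init__(self, params, parent=None):
        self.params = params          # candidate parameter vector
        self.parent = parent
        self.children = []
        self.visits = 0
        self.total_reward = 0.0

    def ucb1(self, c=1.4):
        # Standard UCT score: mean reward plus an exploration bonus.
        if self.visits == 0:
            return float("inf")
        exploit = self.total_reward / self.visits
        explore = c * math.sqrt(math.log(self.parent.visits) / self.visits)
        return exploit + explore

def expand(node, n_children=4, sigma=0.01, rng=None):
    # Children are small Gaussian perturbations of the parent's parameters.
    rng = rng or np.random.default_rng()
    for _ in range(n_children):
        child_params = node.params + sigma * rng.standard_normal(node.params.shape)
        node.children.append(Node(child_params, parent=node))

def parameter_mcts(init_params, iterations=200, rng=None):
    rng = rng or np.random.default_rng(0)
    root = Node(np.asarray(init_params, dtype=float))
    best = (root.params, evaluate_placement(root.params))
    for _ in range(iterations):
        # Selection: descend by UCB1 until a leaf is reached.
        node = root
        while node.children:
            node = max(node.children, key=lambda n: n.ucb1())
        # Expansion: perturb the leaf's parameters, then evaluate one child.
        if node.visits > 0:
            expand(node, rng=rng)
            node = node.children[0]
        reward = evaluate_placement(node.params)
        if reward > best[1]:
            best = (node.params, reward)
        # Backpropagation: update visit counts and rewards up to the root.
        while node is not None:
            node.visits += 1
            node.total_reward += reward
            node = node.parent
    return best

if __name__ == "__main__":
    params, score = parameter_mcts(np.ones(8))
    print("best score:", score)
```

In this sketch the search starts from the parameters of a model trained on the source chip design and the reward is computed on the target design, so the tree explores nearby parameter settings rather than retraining from scratch; the perturbation scale and branching factor are illustrative choices.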
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8341