CAD-Editor: Text-based CAD Editing through Adapting Large Language Models with Synthetic Data

ICLR 2025 Conference Submission13221 Authors

28 Sept 2024 (modified: 28 Nov 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Computer Aided Design, Generative Models, Text-based Editing, Large Language Models
TL;DR: This paper introduces CAD-Editor, a novel generative model that enables precise text-based editing of CAD designs.
Abstract: Computer Aided Design (CAD) is indispensable across various industries. \emph{Text-based CAD editing}, which automatically modifies CAD models following textual instructions, is important yet not extensively studied. Existing work explores design variation generation, which randomly alters specific parts of a CAD model, offering no control over the final appearance. This work introduces \emph{CAD-Editor} for text-based editing. We leverage Large Language Models (LLMs) as the backbone to take the concatenation of textual instruction and original CAD sequence as input and predict the edited CAD sequence, where the sequence representation of a CAD model is designed for easier processing by LLMs. Moreover, we propose fine-tuning LLMs by using a synthetic dataset followed by a selective dataset. The synthetic data is produced by leveraging powerful existing models, including design variation generation models for producing paired CAD models and multi-modal models for capturing textual differences between these pairs. The selective data is created by choosing top examples from outputs of the initially fine-tuned LLMs based on human feedback or metrics. In this way, a large-scale synthetic dataset offers basic capability while a selective dataset that is less noisy and better aligned with human intentions boosts performance further. Extensive experiments demonstrate the advantage of CAD-Editor both quantitatively and qualitatively.
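The abstract describes two mechanisms: concatenating the textual instruction with the original CAD sequence as LLM input, and building a selective dataset by keeping top-scoring outputs of the initially fine-tuned model. A minimal Python sketch of both ideas is below; the prompt format, function names, and `score_fn` are illustrative assumptions, not the paper's actual implementation.

```python
def build_prompt(instruction: str, cad_seq: str) -> str:
    # Concatenate the editing instruction and the original CAD sequence
    # into one input string for the LLM (exact template is hypothetical).
    return f"Instruction: {instruction}\nOriginal CAD: {cad_seq}\nEdited CAD:"

def select_top_examples(candidates, score_fn, k):
    # Selective-data stage: keep the k highest-scoring model outputs,
    # where score_fn stands in for human feedback or an automatic metric.
    return sorted(candidates, key=score_fn, reverse=True)[:k]
```

In this sketch, the large synthetic dataset would be used for an initial fine-tuning pass, after which `select_top_examples` filters that model's own outputs into a smaller, cleaner set for a second pass.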
Primary Area: applications to computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 13221