Knowledge Guided Bayesian Flow Network for CAD Sequence Generation

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Parametric CAD modeling, Bayesian flow network, Quantitatively constrained generation
Abstract: The controllable generation of parametric CAD sequences under explicit quantitative constraints (e.g., surface area, volume) is crucial for automating design processes, as it enables the efficient and precise creation of complex geometric models that meet predefined functional or physical requirements. However, this task remains highly challenging due to the multimodal nature of CAD data, which combines discrete commands with continuous parameters, and due to the long-range dependencies among parameters that are critical for satisfying the constraints. While deep generative models have made remarkable progress in various domains, they still struggle with parametric CAD sequence generation under strict quantitative constraints. To tackle this, we propose a generative framework based on a Knowledge-Guided Bayesian Flow Network (KGBFN). Our approach leverages Bayesian flow to jointly model discrete and continuous variables, effectively capturing the complex structure of CAD data. Moreover, we introduce a knowledge-guided Bayesian update strategy that iteratively injects property constraints during the generation process, significantly enhancing the accuracy of the produced sequences. To improve computational efficiency, we design a dual-channel Bayesian flow network that integrates both standard and knowledge-guided updates, employing an annealing mechanism to dynamically control which channel is active. This design effectively balances knowledge guidance against optimization cost. We validate our method on CAD generation tasks constrained by quantitative properties such as surface area and volume. Experimental results demonstrate that our model consistently outperforms state-of-the-art methods in both single- and multi-condition constrained generation, achieving superior accuracy and feasibility.
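The abstract does not give the update equations, but the core idea can be illustrated with a minimal sketch. The standard Bayesian Flow Network update for a categorical variable (following Graves et al.'s BFN formulation) multiplies the current belief by the likelihood of a noisy sender sample; a knowledge-guided variant could additionally multiply in a constraint-derived distribution whose weight is annealed. The functions `guided_update`, the `guidance` distribution, and the annealing weight `lam` below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def bayesian_update(theta, y):
    # Standard BFN-style categorical update: posterior proportional to
    # prior belief theta times exp(y), where y is the noisy sender sample
    # (logits). Computed in log space for numerical stability.
    logits = np.log(theta) + y
    logits -= logits.max()
    post = np.exp(logits)
    return post / post.sum()

def guided_update(theta, y, guidance, lam):
    # Hypothetical knowledge-guided channel: blend in a constraint-derived
    # distribution `guidance`, raised to an annealed weight lam in [0, 1].
    # lam = 0 recovers the standard update; lam = 1 applies full guidance.
    post = bayesian_update(theta, y)
    logits = np.log(post) + lam * np.log(guidance)
    logits -= logits.max()
    out = np.exp(logits)
    return out / out.sum()

# Toy example with K = 4 command classes.
theta = np.full(4, 0.25)            # uniform prior belief
y = np.array([2.0, 0.0, 0.0, 0.0])  # sender sample favoring class 0
guidance = np.array([0.01, 0.90, 0.045, 0.045])  # constraint favors class 1

plain = bayesian_update(theta, y)        # mass concentrates on class 0
guided = guided_update(theta, y, guidance, lam=1.0)  # shifts toward class 1
```

A dual-channel scheme as described in the abstract would then select between `bayesian_update` and `guided_update` at each generation step according to an annealing schedule, so the costlier guided channel is only activated when needed.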
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 10926