Targeted control of fast prototyping through domain-specific interface

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · License: CC BY-NC-ND 4.0
Abstract: Industrial designers have long sought a natural and intuitive way to achieve targeted control of prototype models: configuring and adjusting the models seamlessly with simple natural language instructions, according to their intentions and without relying on complex modeling commands. While Large Language Models have shown promise in this area, their potential for controlling prototype models through language remains underexploited. This limitation stems from gaps between designers' languages and modeling languages, including mismatches in abstraction level, fluctuations in semantic precision, and divergences in lexical scope. To bridge these gaps, we propose an interface architecture that serves as a medium between the two languages. Grounded in design principles derived from a systematic investigation of fast prototyping practices, we devise the interface's operational mechanism and develop an algorithm for its automated domain specification. Both machine-based evaluations and human studies on fast prototyping across various product design domains demonstrate the interface's potential to function as an auxiliary module for Large Language Models, enabling precise and effective targeted control of prototype models.
Lay Summary: Imagine a sculptor working with clay---they can feel, shape, and instantly see their vision come to life. Product designers dream of this same fluid experience when creating digital models on computers. The problem? There's a massive communication gap. The way designers think about products---envisioning curves, textures, and functions---is completely different from the technical language computers need to build digital models. It's like trying to paint a masterpiece while speaking through a translator who only knows engineering terms. Our research tackles this challenge by developing a method that automatically creates digital interfaces acting as translators. These interfaces convert a designer's creative vision directly into the technical instructions computers need to build accurate digital models. Instead of designers struggling with complex software, they can focus on designing. The result? Designs that better capture the original creative vision. Beyond design studios, this approach of bridging communication gaps between creative and technical teams could reshape collaboration in manufacturing.
Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Primary Area: Applications->Everything Else
Keywords: Fast Prototyping, Domain-Specific Representation, Human Language Specification, Smart Manufacturing
Submission Number: 61