

CAD-Editor: A Locate-then-Infill Framework with Automated Training Data Synthesis for Text-Based CAD Editing

February 6, 2025
Authors: Yu Yuan, Shizhao Sun, Qi Liu, Jiang Bian
cs.AI

Abstract

Computer Aided Design (CAD) is indispensable across various industries. Text-based CAD editing, which automates the modification of CAD models based on textual instructions, holds great potential but remains underexplored. Existing methods primarily focus on design variation generation or text-based CAD generation, either lacking support for text-based control or neglecting existing CAD models as constraints. We introduce CAD-Editor, the first framework for text-based CAD editing. To address the challenge of demanding triplet data with accurate correspondence for training, we propose an automated data synthesis pipeline. This pipeline utilizes design variation models to generate pairs of original and edited CAD models and employs Large Vision-Language Models (LVLMs) to summarize their differences into editing instructions. To tackle the composite nature of text-based CAD editing, we propose a locate-then-infill framework that decomposes the task into two focused sub-tasks: locating regions requiring modification and infilling these regions with appropriate edits. Large Language Models (LLMs) serve as the backbone for both sub-tasks, leveraging their capabilities in natural language understanding and CAD knowledge. Experiments show that CAD-Editor achieves superior performance both quantitatively and qualitatively.
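The locate-then-infill decomposition described above can be sketched as two composable stages. The snippet below is a minimal, hypothetical illustration only: CAD-Editor uses LLMs for both sub-tasks, whereas here stub functions (`locate`, `infill`, the `<mask>` token, and the toy `primitive:param` sequence format) stand in so the control flow is runnable; none of these names come from the paper.

```python
# Hypothetical sketch of the locate-then-infill decomposition. The real
# CAD-Editor prompts LLMs for both stages; deterministic stubs are used here
# so the two-stage control flow can run end to end.

MASK = "<mask>"

def locate(cad_seq, instruction):
    """Stage 1: mark regions of the CAD sequence that the instruction targets.

    Stub heuristic: mask any token whose primitive name is mentioned in the
    instruction. A real system would ask an LLM to identify these spans.
    """
    return [MASK if tok.split(":")[0] in instruction else tok
            for tok in cad_seq]

def infill(masked_seq, instruction):
    """Stage 2: fill each masked region with an appropriate edit.

    Stub heuristic: a fixed lookup from instruction keywords to replacement
    tokens, standing in for LLM generation conditioned on the instruction.
    """
    edits = {"taller": "cylinder:h=20"}
    replacement = next(v for k, v in edits.items() if k in instruction)
    return [replacement if tok == MASK else tok for tok in masked_seq]

def edit_cad(cad_seq, instruction):
    """Compose the two focused sub-tasks into one text-based edit."""
    return infill(locate(cad_seq, instruction), instruction)

if __name__ == "__main__":
    original = ["box:w=10", "cylinder:h=10"]
    print(edit_cad(original, "make the cylinder taller"))
    # ['box:w=10', 'cylinder:h=20']
```

The point of the decomposition is that each stage solves a narrower problem: the locator only decides *where* to edit, so the infiller can condition on an already-masked sequence and decide only *what* to write there.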
