MeshCoder: LLM-Powered Structured Mesh Code Generation from Point Clouds

August 20, 2025
Authors: Bingquan Dai, Li Ray Luo, Qihong Tang, Jie Wang, Xinyu Lian, Hao Xu, Minghan Qin, Xudong Xu, Bo Dai, Haoqian Wang, Zhaoyang Lyu, Jiangmiao Pang
cs.AI

Abstract

Reconstructing 3D objects into editable programs is pivotal for applications like reverse engineering and shape editing. However, existing methods often rely on limited domain-specific languages (DSLs) and small-scale datasets, restricting their ability to model complex geometries and structures. To address these challenges, we introduce MeshCoder, a novel framework that reconstructs complex 3D objects from point clouds into editable Blender Python scripts. We develop a comprehensive set of expressive Blender Python APIs capable of synthesizing intricate geometries. Leveraging these APIs, we construct a large-scale paired object-code dataset, where the code for each object is decomposed into distinct semantic parts. Subsequently, we train a multimodal large language model (LLM) that translates 3D point clouds into executable Blender Python scripts. Our approach not only achieves superior performance in shape-to-code reconstruction tasks but also facilitates intuitive geometric and topological editing through convenient code modifications. Furthermore, our code-based representation enhances the reasoning capabilities of LLMs in 3D shape understanding tasks. Together, these contributions establish MeshCoder as a powerful and flexible solution for programmatic 3D shape reconstruction and understanding.
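
To make the code-based representation concrete, the sketch below shows the kind of part-decomposed Blender Python script the abstract describes, where each semantic part of an object is produced by its own block of code. It uses only standard bpy primitive operators; the paper's own expressive APIs are not detailed in the abstract, so every object name, dimension, and the add_part helper here are hypothetical illustrations, not MeshCoder's actual interface.

```python
# Illustrative sketch only: a part-decomposed Blender Python script for a
# simple table, in the spirit of the editable scripts the abstract describes.
# Uses standard bpy primitives; all names and dimensions are hypothetical.
import bpy

def add_part(name):
    """Rename the most recently added object to mark its semantic part."""
    obj = bpy.context.active_object
    obj.name = name
    return obj

# Semantic part 1: table top (a flattened cube).
bpy.ops.mesh.primitive_cube_add(size=1.0, location=(0.0, 0.0, 0.75))
top = add_part("table_top")
top.scale = (2.0, 1.2, 0.1)  # 2.0 x 1.2 top, 0.1 thick

# Semantic part 2: four legs (cylinders), one per corner.
for i, (x, y) in enumerate([(-0.9, -0.5), (-0.9, 0.5), (0.9, -0.5), (0.9, 0.5)]):
    bpy.ops.mesh.primitive_cylinder_add(
        radius=0.05, depth=0.7, location=(x, y, 0.35))
    add_part(f"leg_{i}")
```

Under a representation like this, the geometric and topological edits the abstract mentions reduce to ordinary code changes: thickening the legs is a one-parameter edit to radius, and removing a leg is deleting one loop iteration.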