

MeshCoder: LLM-Powered Structured Mesh Code Generation from Point Clouds

August 20, 2025
Authors: Bingquan Dai, Li Ray Luo, Qihong Tang, Jie Wang, Xinyu Lian, Hao Xu, Minghan Qin, Xudong Xu, Bo Dai, Haoqian Wang, Zhaoyang Lyu, Jiangmiao Pang
cs.AI

Abstract

Reconstructing 3D objects into editable programs is pivotal for applications like reverse engineering and shape editing. However, existing methods often rely on limited domain-specific languages (DSLs) and small-scale datasets, restricting their ability to model complex geometries and structures. To address these challenges, we introduce MeshCoder, a novel framework that reconstructs complex 3D objects from point clouds into editable Blender Python scripts. We develop a comprehensive set of expressive Blender Python APIs capable of synthesizing intricate geometries. Leveraging these APIs, we construct a large-scale paired object-code dataset, where the code for each object is decomposed into distinct semantic parts. Subsequently, we train a multimodal large language model (LLM) that translates 3D point clouds into executable Blender Python scripts. Our approach not only achieves superior performance in shape-to-code reconstruction tasks but also facilitates intuitive geometric and topological editing through convenient code modifications. Furthermore, our code-based representation enhances the reasoning capabilities of LLMs in 3D shape understanding tasks. Together, these contributions establish MeshCoder as a powerful and flexible solution for programmatic 3D shape reconstruction and understanding.
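The abstract describes representing each object as a Blender Python script decomposed into named semantic parts, so that geometric and topological edits reduce to code changes. As a rough illustration of that idea, here is a minimal sketch using only stock `bpy` operators; the part names, parameters, and helper functions below are hypothetical, not the paper's actual API, which the authors describe as a richer custom set of Blender Python calls.

```python
# Hypothetical sketch of a part-decomposed "shape program" in stock Blender
# Python (bpy). MeshCoder's own API is more expressive; this only illustrates
# representing an object as editable, semantically named parts.
import bpy

def make_table_top(width=1.0, depth=0.6, thickness=0.05, height=0.75):
    # One cuboid primitive for the table top.
    bpy.ops.mesh.primitive_cube_add(size=1.0, location=(0, 0, height))
    top = bpy.context.active_object
    top.name = "table_top"
    top.scale = (width, depth, thickness)
    return top

def make_leg(x, y, radius=0.03, height=0.75):
    # One cylinder primitive per leg.
    bpy.ops.mesh.primitive_cylinder_add(
        radius=radius, depth=height, location=(x, y, height / 2)
    )
    leg = bpy.context.active_object
    leg.name = f"leg_{x:+.2f}_{y:+.2f}"
    return leg

make_table_top(width=1.2)
for sx in (-0.55, 0.55):
    for sy in (-0.25, 0.25):
        make_leg(sx, sy)
```

Under such a representation, a geometric edit (e.g., widening the top) is a parameter change, while a topological edit (e.g., a three-legged table) is a change to the part-generating calls themselves.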