Infinite Mobility: Scalable High-Fidelity Synthesis of Articulated Objects via Procedural Generation
March 17, 2025
Authors: Xinyu Lian, Zichao Yu, Ruiming Liang, Yitong Wang, Li Ray Luo, Kaixu Chen, Yuanzhen Zhou, Qihong Tang, Xudong Xu, Zhaoyang Lyu, Bo Dai, Jiangmiao Pang
cs.AI
Abstract
Large-scale, high-quality articulated objects are urgently needed for many tasks in embodied AI. Most existing methods for creating articulated objects are either data-driven or simulation-based: the former are limited by the scale and quality of the training data, and the latter by the fidelity of the simulation and the heavy labour it requires. In this paper, we propose Infinite Mobility, a novel method for synthesizing high-fidelity articulated objects through procedural generation. A user study and quantitative evaluations demonstrate that our method produces results that surpass current state-of-the-art methods and are comparable to human-annotated datasets in both physical properties and mesh quality. Furthermore, we show that our synthetic data can serve as training data for generative models, enabling further scaling up. Code is available at https://github.com/Intern-Nexus/Infinite-Mobility.
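To make the idea of procedurally generating articulated objects concrete, the sketch below shows one minimal form such a pipeline can take: sample part dimensions and joint parameters, then emit the resulting kinematic tree as URDF. All names, dimensions, and the cabinet-with-door layout here are hypothetical illustrations and are not taken from the Infinite Mobility codebase; see the repository linked above for the actual implementation.

```python
# Minimal illustrative sketch of procedural articulated-object generation:
# sample a cabinet body and a hinged door, then emit a URDF kinematic tree.
# All parameters and names are hypothetical, not from Infinite Mobility.
import random
import xml.etree.ElementTree as ET


def box_link(name, size):
    """Build a URDF <link> containing a single box visual of size (x, y, z)."""
    link = ET.Element("link", name=name)
    visual = ET.SubElement(link, "visual")
    geom = ET.SubElement(visual, "geometry")
    ET.SubElement(geom, "box", size=" ".join(f"{s:.3f}" for s in size))
    return link


def revolute_joint(name, parent, child, origin_xyz, axis_xyz, lower, upper):
    """Build a URDF revolute <joint> connecting a parent link to a child link."""
    joint = ET.Element("joint", name=name, type="revolute")
    ET.SubElement(joint, "parent", link=parent)
    ET.SubElement(joint, "child", link=child)
    ET.SubElement(joint, "origin",
                  xyz=" ".join(f"{v:.3f}" for v in origin_xyz), rpy="0 0 0")
    ET.SubElement(joint, "axis", xyz=" ".join(str(v) for v in axis_xyz))
    ET.SubElement(joint, "limit", lower=f"{lower:.3f}", upper=f"{upper:.3f}",
                  effort="10", velocity="1")
    return joint


def sample_cabinet(seed=None):
    """Sample a cabinet with one hinged door and return it as a URDF string."""
    rng = random.Random(seed)
    width = rng.uniform(0.4, 1.0)    # sampled body dimensions (metres)
    depth = rng.uniform(0.3, 0.6)
    height = rng.uniform(0.6, 1.2)
    door_thickness = 0.02

    robot = ET.Element("robot", name="procedural_cabinet")
    robot.append(box_link("body", (width, depth, height)))
    robot.append(box_link("door", (width, door_thickness, height)))
    # Hinge along one vertical front edge of the body, swinging about the z axis,
    # with a randomly sampled opening range.
    robot.append(revolute_joint(
        "door_hinge", parent="body", child="door",
        origin_xyz=(-width / 2, depth / 2 + door_thickness / 2, 0),
        axis_xyz=(0, 0, 1),
        lower=0.0, upper=rng.uniform(1.2, 1.57),
    ))
    return ET.tostring(robot, encoding="unicode")


if __name__ == "__main__":
    print(sample_cabinet(seed=42))
```

Running the script prints a small URDF document whose geometry and joint limits change with the seed; a real pipeline of this kind would extend the same pattern to richer part hierarchies, materials, and mesh detail.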