PhysWorld: From Real Videos to World Models of Deformable Objects via Physics-Aware Demonstration Synthesis

October 24, 2025
Authors: Yu Yang, Zhilu Zhang, Xiang Zhang, Yihan Zeng, Hui Li, Wangmeng Zuo
cs.AI

Abstract

Interactive world models that simulate object dynamics are crucial for robotics, VR, and AR. However, learning physics-consistent dynamics models from limited real-world video data remains a significant challenge, especially for deformable objects with spatially varying physical properties. To overcome this data scarcity, we propose PhysWorld, a novel framework that uses a simulator to synthesize physically plausible and diverse demonstrations for learning efficient world models. Specifically, we first construct a physics-consistent digital twin within a Material Point Method (MPM) simulator via constitutive model selection and global-to-local optimization of physical properties. We then apply part-aware perturbations to the physical properties and generate varied motion patterns for the digital twin, synthesizing extensive and diverse demonstrations. Finally, using these demonstrations, we train a lightweight GNN-based world model with the physical properties embedded; the real video can also be used to further refine these properties. PhysWorld achieves accurate and fast future predictions for various deformable objects and generalizes well to novel interactions. Experiments show that PhysWorld delivers competitive performance while running inference 47 times faster than PhysTwin, the recent state-of-the-art method.
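
To make the demonstration-synthesis step more concrete, here is a minimal sketch of a part-aware perturbation over spatially varying physical properties, assuming the digital twin is represented as particles carrying semantic part labels. All names, property choices (Young's modulus, Poisson's ratio), distributions, and scales below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-particle state of the digital twin: each particle has a
# semantic part label and base physical properties obtained from the
# global-to-local optimization stage (values here are placeholders).
n_particles = 4096
part_labels = rng.integers(0, 3, size=n_particles)   # 3 semantic parts
youngs_modulus = np.full(n_particles, 1e5)           # optimized base stiffness
poisson_ratio = np.full(n_particles, 0.3)            # optimized base value

def part_aware_perturbation(base, labels, rel_scale=0.2):
    """Perturb a property with one multiplicative factor per part, so
    particles within a part stay coherent while parts vary independently."""
    factors = rng.lognormal(mean=0.0, sigma=rel_scale, size=labels.max() + 1)
    return base * factors[labels]

# Synthesize one demonstration's property set; repeating this (together with
# varied initial motions in the simulator) yields a large, heterogeneous
# corpus of demonstrations for training the world model.
demo_E = part_aware_perturbation(youngs_modulus, part_labels)
demo_nu = np.clip(part_aware_perturbation(poisson_ratio, part_labels, 0.05),
                  0.0, 0.49)  # keep Poisson's ratio physically valid
print(demo_E[:5], demo_nu[:5])
```

Sampling one multiplicative factor per part, rather than per particle, keeps each part internally coherent while letting parts vary independently, which is one plausible reading of the "part-aware" perturbation described in the abstract.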