IRASim: Learning Interactive Real-Robot Action Simulators
June 20, 2024
Authors: Fangqi Zhu, Hongtao Wu, Song Guo, Yuxiao Liu, Chilam Cheang, Tao Kong
cs.AI
Abstract
Scalable robot learning in the real world is limited by the cost and safety
issues of real robots. In addition, rolling out robot trajectories in the real
world can be time-consuming and labor-intensive. In this paper, we propose to
learn an interactive real-robot action simulator as an alternative. We
introduce a novel method, IRASim, which leverages the power of generative
models to generate extremely realistic videos of a robot arm that executes a
given action trajectory, starting from an initial given frame. To validate the
effectiveness of our method, we create a new benchmark, IRASim Benchmark, based
on three real-robot datasets and perform extensive experiments on the
benchmark. Results show that IRASim outperforms all the baseline methods and is
preferred in human evaluations. We hope that IRASim can serve as an effective
and scalable approach to enhance robot learning in the real world. To promote
research on generative real-robot action simulators, we open-source the code,
benchmark, and checkpoints at https://gen-irasim.github.io.
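The abstract describes a model that, given an initial frame and an action trajectory, predicts a video of the robot arm executing that trajectory. A minimal sketch of that interface with a stand-in stub model is below; all names (`simulate`, the 7-DoF action dimension, frame size) are illustrative assumptions, not the authors' actual API.

```python
import numpy as np

def simulate(initial_frame: np.ndarray, actions: np.ndarray) -> np.ndarray:
    """Stub for an action-conditioned video simulator.

    initial_frame: (H, W, 3) image of the scene at time 0.
    actions:       (T, action_dim) trajectory, one action per frame.
    returns:       (T, H, W, 3) predicted video.

    A real simulator such as IRASim would be a learned generative model;
    here we simply repeat the initial frame to illustrate the shapes.
    """
    num_steps = actions.shape[0]
    return np.repeat(initial_frame[None], num_steps, axis=0)

frame = np.zeros((64, 64, 3), dtype=np.uint8)            # initial observation
trajectory = np.zeros((16, 7), dtype=np.float32)         # 16 steps, 7-DoF actions (assumed)
video = simulate(frame, trajectory)
print(video.shape)  # (16, 64, 64, 3)
```

The key property is that the rollout is driven purely by the action trajectory, so policies can be evaluated against generated video instead of a physical robot.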