

IRASim: Learning Interactive Real-Robot Action Simulators

June 20, 2024
Authors: Fangqi Zhu, Hongtao Wu, Song Guo, Yuxiao Liu, Chilam Cheang, Tao Kong
cs.AI

Abstract

Scalable robot learning in the real world is limited by the cost and safety issues of real robots. In addition, rolling out robot trajectories in the real world can be time-consuming and labor-intensive. In this paper, we propose to learn an interactive real-robot action simulator as an alternative. We introduce a novel method, IRASim, which leverages the power of generative models to generate extremely realistic videos of a robot arm executing a given action trajectory, starting from a given initial frame. To validate the effectiveness of our method, we create a new benchmark, IRASim Benchmark, based on three real-robot datasets and perform extensive experiments on it. Results show that IRASim outperforms all the baseline methods and is preferred in human evaluations. We hope that IRASim can serve as an effective and scalable approach to enhance robot learning in the real world. To promote research on generative real-robot action simulators, we open-source code, benchmark, and checkpoints at https://gen-irasim.github.io.
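The core interface the abstract describes is a model that maps an initial frame plus an action trajectory to a rollout video. The sketch below illustrates that interaction pattern only; the class name, shapes, and the placeholder "generative step" are illustrative assumptions and not IRASim's actual architecture or API.

```python
# Hypothetical sketch of an interactive action-to-video simulator interface:
# given an initial frame and an action trajectory, predict one video frame
# per action step. All names and shapes here are assumptions for illustration.
import numpy as np

class ActionConditionedVideoSimulator:
    """Toy stand-in for a generative action-conditioned video model."""

    def __init__(self, frame_shape=(64, 64, 3), action_dim=7):
        self.frame_shape = frame_shape  # (H, W, C) of each video frame
        self.action_dim = action_dim    # e.g. end-effector pose deltas + gripper

    def rollout(self, initial_frame: np.ndarray, actions: np.ndarray) -> np.ndarray:
        """Return a video of shape (T + 1, H, W, C) for T action steps."""
        assert initial_frame.shape == self.frame_shape
        assert actions.ndim == 2 and actions.shape[1] == self.action_dim
        frames = [initial_frame]
        for a in actions:
            # A real model would run a learned generative step conditioned on
            # the action here; this placeholder just perturbs the last frame.
            nxt = np.clip(frames[-1] + 0.01 * np.linalg.norm(a), 0.0, 1.0)
            frames.append(nxt)
        return np.stack(frames)

sim = ActionConditionedVideoSimulator()
video = sim.rollout(np.zeros((64, 64, 3)), np.zeros((8, 7)))
print(video.shape)  # (9, 64, 64, 3)
```

Because rollouts are generated frame-by-frame from actions, such a simulator can stand in for costly and risky real-robot execution during policy development.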

