
SayTap: Language to Quadrupedal Locomotion

June 13, 2023
Authors: Yujin Tang, Wenhao Yu, Jie Tan, Heiga Zen, Aleksandra Faust, Tatsuya Harada
cs.AI

Abstract

Large language models (LLMs) have demonstrated the potential to perform high-level planning. Yet, it remains a challenge for LLMs to comprehend low-level commands, such as joint angle targets or motor torques. This paper proposes an approach that uses foot contact patterns as an interface bridging human commands in natural language and a locomotion controller that outputs these low-level commands. This results in an interactive system for quadrupedal robots that allows users to flexibly craft diverse locomotion behaviors. We contribute an LLM prompt design, a reward function, and a method to expose the controller to the feasible distribution of contact patterns. The result is a controller capable of achieving diverse locomotion patterns that can be transferred to real robot hardware. Compared with other design choices, the proposed approach enjoys a more than 50% success rate in predicting the correct contact patterns and can solve 10 more tasks out of a total of 30 tasks. Our project site is: https://saytap.github.io.
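To make the interface concrete, here is a minimal illustrative sketch (not the authors' code) of what a foot contact pattern could look like: a 4 × T binary matrix with one row per foot, where 1 marks stance (foot on the ground) and 0 marks swing. The gait names, foot ordering (FL, FR, RL, RR), and cycle length below are hypothetical examples chosen for illustration.

```python
# Hypothetical sketch of a foot contact pattern: one binary row per foot,
# 1 = stance (in contact), 0 = swing. An LLM could emit such a matrix
# from a natural-language command, and a learned controller would track it.

def contact_pattern(gait: str, steps: int = 8) -> list[list[int]]:
    """Return a 4 x steps binary contact pattern for a toy gait."""
    half = steps // 2
    if gait == "trot":      # diagonal foot pairs (FL+RR, FR+RL) alternate
        fl = [1] * half + [0] * half
        fr = [0] * half + [1] * half
        rl, rr = list(fr), list(fl)
    elif gait == "bound":   # front pair then rear pair
        fl = [1] * half + [0] * half
        fr = list(fl)
        rl = [0] * half + [1] * half
        rr = list(rl)
    else:                   # default: stand still, all feet in contact
        fl, fr, rl, rr = ([1] * steps for _ in range(4))
    return [fl, fr, rl, rr]

# Print the pattern a user command like "trot forward" might map to.
for foot, row in zip(("FL", "FR", "RL", "RR"), contact_pattern("trot")):
    print(foot, "".join(map(str, row)))
```

A representation like this is low-dimensional and human-readable, which is what lets it serve as a bridge: the language model only needs to output a short binary sequence rather than joint angles or torques.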