SayTap: Language to Quadrupedal Locomotion

June 13, 2023
作者: Yujin Tang, Wenhao Yu, Jie Tan, Heiga Zen, Aleksandra Faust, Tatsuya Harada
cs.AI

Abstract

Large language models (LLMs) have demonstrated the potential to perform high-level planning. Yet, it remains a challenge for LLMs to comprehend low-level commands, such as joint angle targets or motor torques. This paper proposes an approach to use foot contact patterns as an interface that bridges human commands in natural language and a locomotion controller that outputs these low-level commands. This results in an interactive system for quadrupedal robots that allows the users to craft diverse locomotion behaviors flexibly. We contribute an LLM prompt design, a reward function, and a method to expose the controller to the feasible distribution of contact patterns. The results are a controller capable of achieving diverse locomotion patterns that can be transferred to real robot hardware. Compared with other design choices, the proposed approach enjoys more than 50% success rate in predicting the correct contact patterns and can solve 10 more tasks out of a total of 30 tasks. Our project site is: https://saytap.github.io.
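To make the "foot contact patterns as an interface" idea concrete, here is a minimal illustrative sketch (not the paper's implementation): a contact pattern is represented as a binary matrix over the four feet and discrete timesteps, and a simple tracking reward measures how closely the controller's realized contacts match the desired template. The foot ordering and the reward definition are assumptions made for illustration.

```python
import numpy as np

# Hypothetical sketch: a foot contact pattern is a binary matrix of shape
# (4 feet, T timesteps); 1 = foot in contact with the ground, 0 = in swing.
# Foot order (an assumption, not from the paper): FL, FR, RL, RR.

def trot_pattern(T: int) -> np.ndarray:
    """Toy trotting template: diagonal foot pairs alternate contact."""
    pattern = np.zeros((4, T), dtype=int)
    half = T // 2
    pattern[[0, 3], :half] = 1  # FL + RR touch down in the first half
    pattern[[1, 2], half:] = 1  # FR + RL touch down in the second half
    return pattern

def contact_reward(desired: np.ndarray, achieved: np.ndarray) -> float:
    """Fraction of (foot, timestep) entries where the achieved contact
    matches the desired template -- one simple choice of tracking reward."""
    return float((desired == achieved).mean())

desired = trot_pattern(10)
print(contact_reward(desired, desired))      # perfect tracking -> 1.0
print(contact_reward(desired, 1 - desired))  # fully inverted -> 0.0
```

An LLM's role in this interface is then only to emit such a binary template from a natural-language command (e.g. "trot slowly"), leaving the low-level joint commands to the learned controller.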