

Mobile-Agent: Autonomous Multi-Modal Mobile Device Agent with Visual Perception

January 29, 2024
作者: Junyang Wang, Haiyang Xu, Jiabo Ye, Ming Yan, Weizhou Shen, Ji Zhang, Fei Huang, Jitao Sang
cs.AI

Abstract

Mobile device agents based on Multimodal Large Language Models (MLLMs) are becoming a popular application. In this paper, we introduce Mobile-Agent, an autonomous multi-modal mobile device agent. Mobile-Agent first leverages visual perception tools to accurately identify and locate both the visual and textual elements within the app's front-end interface. Based on the perceived visual context, it then autonomously plans and decomposes the complex operation task, and navigates mobile apps step by step through operations. Unlike previous solutions that rely on app XML files or mobile system metadata, Mobile-Agent's vision-centric approach allows for greater adaptability across diverse mobile operating environments, thereby eliminating the need for system-specific customizations. To assess the performance of Mobile-Agent, we introduce Mobile-Eval, a benchmark for evaluating mobile device operations. Based on Mobile-Eval, we conduct a comprehensive evaluation of Mobile-Agent. The experimental results indicate that Mobile-Agent achieves remarkable accuracy and completion rates. Even for challenging instructions, such as multi-app operations, Mobile-Agent can still fulfill the requirements. Code and model will be open-sourced at https://github.com/X-PLUG/MobileAgent.
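
The abstract describes a vision-centric loop: perceive the current screen, have an MLLM plan the next operation, execute it, and repeat step by step. Below is a minimal Python sketch of such a loop under stated assumptions; every name here (UIElement, perceive, plan_next_action, run_agent) is a hypothetical illustration, not the actual Mobile-Agent API from the repository.

```python
from dataclasses import dataclass


@dataclass
class UIElement:
    """A visual or textual element located on the current screen."""
    label: str                 # recognized text or icon description
    center: tuple[int, int]    # (x, y) screen coordinates for tapping


def perceive(step: int) -> list[UIElement]:
    """Placeholder perception step: a real agent would screenshot the device
    and run OCR / icon detection to localize visual and textual elements."""
    return [UIElement(label="Settings", center=(540, 1600))]


def plan_next_action(instruction: str, history: list[str],
                     elements: list[UIElement]) -> str:
    """Placeholder planning step: a real agent would prompt an MLLM with the
    instruction, the operation history, and the perceived screen elements."""
    if history:
        return "stop"
    return f"tap {elements[0].label} at {elements[0].center}"


def run_agent(instruction: str, max_steps: int = 20) -> list[str]:
    """Iterate perceive -> plan -> act until the planner signals completion."""
    history: list[str] = []
    for step in range(max_steps):
        elements = perceive(step)
        action = plan_next_action(instruction, history, elements)
        if action == "stop":       # planner judges the task complete
            break
        history.append(action)     # a real agent would also execute it (e.g. via ADB)
    return history


if __name__ == "__main__":
    print(run_agent("Open the Settings app"))
```

The dummy perceive and plan_next_action functions stand in for the OCR/detection tools and MLLM calls the paper relies on; swapping them out for real implementations is what makes the loop an actual device agent.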