
ZARA: Zero-shot Motion Time-Series Analysis via Knowledge and Retrieval Driven LLM Agents

August 6, 2025
作者: Zechen Li, Baiyu Chen, Hao Xue, Flora D. Salim
cs.AI

Abstract

Motion sensor time-series are central to human activity recognition (HAR), with applications in health, sports, and smart devices. However, existing methods are trained for fixed activity sets and require costly retraining when new behaviours or sensor setups appear. Recent attempts to use large language models (LLMs) for HAR, typically by converting signals into text or images, suffer from limited accuracy and lack verifiable interpretability. We propose ZARA, the first agent-based framework for zero-shot, explainable HAR directly from raw motion time-series. ZARA integrates an automatically derived pair-wise feature knowledge base that captures discriminative statistics for every activity pair, a multi-sensor retrieval module that surfaces relevant evidence, and a hierarchical agent pipeline that guides the LLM to iteratively select features, draw on this evidence, and produce both activity predictions and natural-language explanations. ZARA enables flexible and interpretable HAR without any fine-tuning or task-specific classifiers. Extensive experiments on 8 HAR benchmarks show that ZARA achieves SOTA zero-shot performance, delivering clear reasoning while exceeding the strongest baselines by 2.53x in macro F1. Ablation studies further confirm the necessity of each module, marking ZARA as a promising step toward trustworthy, plug-and-play motion time-series analysis. Our code is available at https://github.com/zechenli03/ZARA.
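
The abstract describes ZARA's three components only at a high level. The following is a minimal, illustrative Python sketch of how a pair-wise feature knowledge base, a retrieval step, and an LLM-facing prompt could fit together for zero-shot HAR. It is not the authors' implementation: all names (`PairwiseKB`, `retrieve_evidence`, `build_prompt`), the feature choices, and the relevance scoring are hypothetical stand-ins, not taken from the ZARA codebase.

```python
# Illustrative sketch only: hypothetical names, not the authors' implementation.
from dataclasses import dataclass
from itertools import combinations
from typing import Callable, Dict, List, Tuple

import numpy as np


@dataclass
class FeatureStats:
    """Mean/std of one hand-crafted feature for one activity (illustrative)."""
    mean: float
    std: float


class PairwiseKB:
    """Toy pair-wise knowledge base: per-feature statistics for every activity pair."""

    def __init__(self) -> None:
        self.entries: Dict[Tuple[str, str], Dict[str, Tuple[FeatureStats, FeatureStats]]] = {}

    def build(self, samples: Dict[str, np.ndarray],
              feature_fns: Dict[str, Callable[[np.ndarray], float]]) -> None:
        # samples: activity name -> windows of sensor readings, shape (n_windows, window_len)
        per_activity = {}
        for act, windows in samples.items():
            per_activity[act] = {
                name: FeatureStats(float(np.mean([fn(w) for w in windows])),
                                   float(np.std([fn(w) for w in windows])))
                for name, fn in feature_fns.items()
            }
        for a, b in combinations(samples, 2):
            self.entries[(a, b)] = {name: (per_activity[a][name], per_activity[b][name])
                                    for name in feature_fns}

    def retrieve_evidence(self, window: np.ndarray,
                          feature_fns: Dict[str, Callable[[np.ndarray], float]],
                          top_k: int = 3) -> List[str]:
        """Return the pair-wise statistics most relevant to the query window."""
        feats = {name: float(fn(window)) for name, fn in feature_fns.items()}
        scored = []
        for (a, b), stats in self.entries.items():
            for name, (sa, sb) in stats.items():
                # Smaller distance to either activity's mean => more relevant evidence.
                score = min(abs(feats[name] - sa.mean), abs(feats[name] - sb.mean))
                scored.append((score, f"{name}: query={feats[name]:.3f}, "
                                      f"{a}={sa.mean:.3f}+/-{sa.std:.3f}, "
                                      f"{b}={sb.mean:.3f}+/-{sb.std:.3f}"))
        scored.sort(key=lambda item: item[0])
        return [text for _, text in scored[:top_k]]


def build_prompt(evidence: List[str], activities: List[str]) -> str:
    """Assemble a zero-shot prompt asking an LLM for a label plus an explanation."""
    bullets = "\n".join(f"- {line}" for line in evidence)
    return ("You are an activity-recognition agent. Candidate activities: "
            + ", ".join(activities) + ".\n"
            "Retrieved pair-wise feature evidence:\n" + bullets + "\n"
            "Pick the most likely activity and explain your reasoning step by step.")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feature_fns = {"std": lambda w: float(np.std(w)),
                   "mean_abs": lambda w: float(np.mean(np.abs(w)))}
    # Reference windows for two known activities (synthetic stand-ins).
    samples = {"walking": rng.normal(0.0, 1.0, size=(20, 128)),
               "sitting": rng.normal(0.0, 0.1, size=(20, 128))}
    kb = PairwiseKB()
    kb.build(samples, feature_fns)

    query = rng.normal(0.0, 0.12, size=128)  # an unseen motion window
    prompt = build_prompt(kb.retrieve_evidence(query, feature_fns), list(samples))
    print(prompt)  # in a full pipeline, this prompt would be sent to the LLM agent
```

Per the abstract, the actual framework retrieves evidence across multiple sensors and runs a hierarchical, iterative agent pipeline; the sketch above only conveys the overall data flow from knowledge base to retrieval to prompt.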