
ATOM: AdapTive and OptiMized dynamic temporal knowledge graph construction using LLMs

October 26, 2025
作者: Yassir Lairgi, Ludovic Moncla, Khalid Benabdeslem, Rémy Cazabet, Pierre Cléau
cs.AI

Abstract

In today's rapidly expanding data landscape, knowledge extraction from unstructured text is vital for real-time analytics, temporal inference, and dynamic memory frameworks. However, traditional static knowledge graph (KG) construction often overlooks the dynamic and time-sensitive nature of real-world data, limiting adaptability to continuous change. Moreover, recent zero- or few-shot approaches that avoid domain-specific fine-tuning or reliance on prebuilt ontologies often suffer from instability across multiple runs and incomplete coverage of key facts. To address these challenges, we introduce ATOM (AdapTive and OptiMized), a few-shot and scalable approach that builds and continuously updates Temporal Knowledge Graphs (TKGs) from unstructured text. ATOM splits input documents into minimal, self-contained "atomic" facts, improving extraction exhaustivity and stability. It then constructs atomic TKGs from these facts, employing dual-time modeling that distinguishes when information is observed from when it is valid. The resulting atomic TKGs are subsequently merged in parallel. Empirical evaluations show that ATOM achieves ~18% higher exhaustivity, ~17% better stability, and over 90% latency reduction compared to baseline methods, indicating strong scalability potential for dynamic TKG construction.
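To make the dual-time idea and the parallel merge step concrete, the sketch below shows one possible data layout and merge loop in Python. It is a minimal illustration under assumptions, not the paper's implementation: the names TemporalEdge, AtomicTKG, merge_pair, and merge_all are hypothetical, and the merge here is a plain set union over (subject, relation, object, time) tuples rather than whatever entity/relation resolution ATOM actually performs.

```python
# Hypothetical sketch (not the authors' code): each edge keeps the observation
# time (t_obs) separate from its validity interval (valid_from / valid_to),
# and atomic TKGs are merged pairwise in parallel, with set semantics
# deduplicating identical edges.
from dataclasses import dataclass, field
from datetime import datetime
from concurrent.futures import ThreadPoolExecutor
from typing import Optional

@dataclass(frozen=True)
class TemporalEdge:
    subject: str
    relation: str
    obj: str
    t_obs: datetime                        # when the information was observed
    valid_from: Optional[datetime] = None  # when the fact starts to hold
    valid_to: Optional[datetime] = None    # when it stops holding (None = open)

@dataclass
class AtomicTKG:
    edges: set[TemporalEdge] = field(default_factory=set)

def merge_pair(a: AtomicTKG, b: AtomicTKG) -> AtomicTKG:
    """Union two atomic TKGs; identical edges collapse to one."""
    return AtomicTKG(edges=a.edges | b.edges)

def merge_all(graphs: list[AtomicTKG]) -> AtomicTKG:
    """Merge atomic TKGs pairwise in parallel until one graph remains."""
    while len(graphs) > 1:
        pairs = [(graphs[i], graphs[i + 1]) for i in range(0, len(graphs) - 1, 2)]
        leftover = [graphs[-1]] if len(graphs) % 2 else []
        with ThreadPoolExecutor() as pool:
            graphs = list(pool.map(lambda p: merge_pair(*p), pairs)) + leftover
    return graphs[0] if graphs else AtomicTKG()
```

Pairwise merging keeps each merge small and independent, which is what allows the step to be parallelized and is consistent with the latency reduction the abstract reports.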