

ATOM: AdapTive and OptiMized dynamic temporal knowledge graph construction using LLMs

October 26, 2025
Authors: Yassir Lairgi, Ludovic Moncla, Khalid Benabdeslem, Rémy Cazabet, Pierre Cléau
cs.AI

Abstract

In today's rapidly expanding data landscape, knowledge extraction from unstructured text is vital for real-time analytics, temporal inference, and dynamic memory frameworks. However, traditional static knowledge graph (KG) construction often overlooks the dynamic and time-sensitive nature of real-world data, limiting adaptability to continuous changes. Moreover, recent zero- or few-shot approaches that avoid domain-specific fine-tuning or reliance on prebuilt ontologies often suffer from instability across multiple runs, as well as incomplete coverage of key facts. To address these challenges, we introduce ATOM (AdapTive and OptiMized), a few-shot and scalable approach that builds and continuously updates Temporal Knowledge Graphs (TKGs) from unstructured texts. ATOM splits input documents into minimal, self-contained "atomic" facts, improving extraction exhaustivity and stability. Then, it constructs atomic TKGs from these facts while employing dual-time modeling that distinguishes when information is observed from when it is valid. The resulting atomic TKGs are subsequently merged in parallel. Empirical evaluations demonstrate that ATOM achieves ~18% higher exhaustivity, ~17% better stability, and over 90% latency reduction compared to baseline methods, indicating strong scalability potential for dynamic TKG construction.
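The two core ideas in the abstract — dual-time annotation (observation time vs. validity interval) and parallel merging of atomic TKGs — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: the `AtomicFact` fields, the example facts, and the tree-reduction union merge are all hypothetical choices made for illustration.

```python
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass


@dataclass(frozen=True)
class AtomicFact:
    """A minimal, self-contained fact with dual-time annotations.

    `t_observed` records when the fact was extracted (observation time);
    `t_valid` is the interval during which the fact is asserted to hold.
    (Field names are illustrative, not taken from the paper.)
    """
    subject: str
    relation: str
    obj: str
    t_observed: str            # e.g. ingestion timestamp
    t_valid: tuple[str, str]   # (start, end) of validity; "open" = ongoing


def merge_atomic_tkgs(tkgs: list[set[AtomicFact]]) -> set[AtomicFact]:
    """Merge atomic TKGs pairwise in parallel (tree reduction via set union)."""
    while len(tkgs) > 1:
        pairs = [(tkgs[i], tkgs[i + 1]) for i in range(0, len(tkgs) - 1, 2)]
        leftover = [tkgs[-1]] if len(tkgs) % 2 else []
        with ThreadPoolExecutor() as pool:
            tkgs = list(pool.map(lambda p: p[0] | p[1], pairs)) + leftover
    return tkgs[0] if tkgs else set()


# Hypothetical atomic facts extracted from two document chunks.
f1 = AtomicFact("ACME", "acquired", "Widgets Inc",
                "2025-10-26", ("2025-01-01", "open"))
f2 = AtomicFact("ACME", "headquartered_in", "Lyon",
                "2025-10-26", ("2020-01-01", "open"))

# Duplicates across atomic TKGs collapse under set union.
merged = merge_atomic_tkgs([{f1}, {f2}, {f1}])
print(len(merged))  # → 2
```

In a real system the merge step would also resolve entity aliases and reconcile overlapping validity intervals rather than relying on exact-match deduplication; the set union here stands in for that reconciliation only to show the parallel tree-reduction shape.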