

BYOKG-RAG: Multi-Strategy Graph Retrieval for Knowledge Graph Question Answering

July 5, 2025
作者: Costas Mavromatis, Soji Adeshina, Vassilis N. Ioannidis, Zhen Han, Qi Zhu, Ian Robinson, Bryan Thompson, Huzefa Rangwala, George Karypis
cs.AI

Abstract

Knowledge graph question answering (KGQA) presents significant challenges due to the structural and semantic variations across input graphs. Existing works rely on Large Language Model (LLM) agents for graph traversal and retrieval, an approach that is sensitive to traversal initialization, prone to entity linking errors, and may not generalize well to custom ("bring-your-own") KGs. We introduce BYOKG-RAG, a framework that enhances KGQA by synergistically combining LLMs with specialized graph retrieval tools. In BYOKG-RAG, LLMs generate critical graph artifacts (question entities, candidate answers, reasoning paths, and OpenCypher queries), and graph tools link these artifacts to the KG and retrieve relevant graph context. The retrieved context enables the LLM to iteratively refine its graph linking and retrieval before final answer generation. By retrieving context from different graph tools, BYOKG-RAG offers a more general and robust solution for QA over custom KGs. Through experiments on five benchmarks spanning diverse KG types, we demonstrate that BYOKG-RAG outperforms the second-best graph retrieval method by 4.5 percentage points while showing better generalization to custom KGs. The BYOKG-RAG framework is open-sourced at https://github.com/awslabs/graphrag-toolkit.
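The abstract describes an iterative loop: the LLM proposes graph artifacts, graph tools link them to the KG and retrieve context, and the loop repeats until linking and retrieval stabilize. The sketch below illustrates that control flow only; all names (`generate_artifacts`, `link_and_retrieve`, the toy triple store) are illustrative stand-ins and not the actual awslabs/graphrag-toolkit API.

```python
# Minimal sketch of the BYOKG-RAG retrieve-refine loop, with the LLM and
# graph tools replaced by toy stand-ins over a triple-store KG.

# Toy KG as (head, relation, tail) triples.
KG = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
]

def generate_artifacts(question, context):
    """Stand-in for the LLM: propose question entities and reasoning paths."""
    entities = [h for (h, _, _) in KG if h.lower() in question.lower()]
    paths = [["capital_of"]] if "capital" in question else []
    return {"entities": entities, "paths": paths}

def link_and_retrieve(artifacts):
    """Stand-in for graph tools: link artifacts to the KG and pull triples."""
    return [t for t in KG if t[0] in artifacts["entities"]]

def byokg_rag(question, max_iters=3):
    context = []
    for _ in range(max_iters):
        artifacts = generate_artifacts(question, context)
        retrieved = link_and_retrieve(artifacts)
        if retrieved == context:  # linking/retrieval has stabilized
            break
        context = retrieved
    # Final answer generation from retrieved context (stubbed).
    return [tail for (_, rel, tail) in context if rel == "capital_of"]

print(byokg_rag("Paris is the capital of which country?"))  # ['France']
```

In the real framework the refinement step matters because early iterations may mislink entities; feeding retrieved context back to the artifact generator lets later iterations correct those links before the final answer is produced.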