

EcoAssistant: Using LLM Assistant More Affordably and Accurately

October 3, 2023
Authors: Jieyu Zhang, Ranjay Krishna, Ahmed H. Awadallah, Chi Wang
cs.AI

Abstract

Today, users turn to Large Language Models (LLMs) as assistants to answer queries that require external knowledge; they ask about the weather in a specific city, about stock prices, and even about the locations of specific places in their neighborhood. These queries require the LLM to produce code that invokes external APIs to answer the user's question, yet LLMs rarely produce correct code on the first try, requiring iterative code refinement based on execution results. In addition, using LLM assistants to support high query volumes can be expensive. In this work, we contribute a framework, EcoAssistant, that enables LLMs to answer code-driven queries more affordably and accurately. EcoAssistant contains three components. First, it allows the LLM assistants to converse with an automatic code executor to iteratively refine code or to produce answers based on the execution results. Second, we use a hierarchy of LLM assistants, which attempts to answer the query with weaker, cheaper LLMs before backing off to stronger, more expensive ones. Third, we retrieve solutions from past successful queries as in-context demonstrations to help subsequent queries. Empirically, we show that EcoAssistant offers distinct advantages for affordability and accuracy, surpassing GPT-4 by 10 percentage points in success rate at less than 50% of GPT-4's cost.
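The abstract describes three mechanisms: iterative code refinement against an executor, a cheap-to-expensive assistant hierarchy, and retrieval of past solutions as in-context demonstrations. The Python sketch below shows one way these pieces could fit together; it is not the paper's actual implementation. The helper names (query_llm, execute_code, SolutionCache), the model list, and the recency-based retrieval are illustrative assumptions only.

```python
# Minimal sketch of an EcoAssistant-style query loop, under the assumptions
# stated above. query_llm and execute_code are hypothetical placeholders to be
# replaced with a real LLM client and a sandboxed code executor.

from dataclasses import dataclass, field


@dataclass
class SolutionCache:
    """Stores solutions of past successful queries for in-context demonstration."""
    solutions: list[tuple[str, str]] = field(default_factory=list)  # (query, code)

    def retrieve(self, query: str, k: int = 2) -> list[tuple[str, str]]:
        # Placeholder retrieval: return the k most recent solutions.
        # A real system would rank cached solutions by similarity to the query.
        return self.solutions[-k:]

    def add(self, query: str, code: str) -> None:
        self.solutions.append((query, code))


def query_llm(model: str, prompt: str) -> str:
    """Hypothetical LLM call; wire this to an actual API client."""
    raise NotImplementedError


def execute_code(code: str) -> tuple[bool, str]:
    """Hypothetical sandboxed executor returning (success, output_or_error)."""
    raise NotImplementedError


def answer_query(query: str, cache: SolutionCache,
                 models=("cheap-model", "strong-model"), max_refinements: int = 3):
    """Try cheaper assistants first; within each, refine code on executor feedback."""
    demos = cache.retrieve(query)
    demo_text = "\n\n".join(f"Q: {q}\nSolution:\n{c}" for q, c in demos)

    for model in models:                      # component 2: assistant hierarchy
        prompt = f"{demo_text}\n\nWrite code to answer: {query}"  # component 3: demonstrations
        for _ in range(max_refinements):      # component 1: iterative refinement
            code = query_llm(model, prompt)
            success, output = execute_code(code)
            if success:
                cache.add(query, code)        # remember the working solution
                return output
            prompt += f"\n\nThe code failed with:\n{output}\nPlease fix it."
    return None  # every assistant in the hierarchy failed
```

The cost saving in this style of cascade comes from only escalating to the stronger, pricier model when the cheaper one cannot produce code that executes successfully; cached demonstrations further raise the chance that the cheap model succeeds on later, similar queries.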