
Babel: Open Multilingual Large Language Models Serving Over 90% of Global Speakers

March 2, 2025
作者: Yiran Zhao, Chaoqun Liu, Yue Deng, Jiahao Ying, Mahani Aljunied, Zhaodonghui Li, Lidong Bing, Hou Pong Chan, Yu Rong, Deli Zhao, Wenxuan Zhang
cs.AI

Abstract

Large language models (LLMs) have revolutionized natural language processing (NLP), yet open-source multilingual LLMs remain scarce, and existing models are often limited in language coverage. Such models typically prioritize well-resourced languages, while widely spoken but under-resourced languages are overlooked. To address this disparity, we introduce Babel, an open multilingual LLM that covers the top 25 languages by number of speakers, supports over 90% of the global population, and includes many languages neglected by other open multilingual LLMs. Unlike traditional continued-pretraining approaches, Babel expands its parameter count through a layer extension technique that raises Babel's performance ceiling. We introduce two variants: Babel-9B, designed for efficient inference and fine-tuning, and Babel-83B, which sets a new standard for open multilingual LLMs. Extensive evaluations on multilingual tasks demonstrate its superior performance compared with open LLMs of comparable size. In addition, using open-source supervised fine-tuning datasets, Babel achieves remarkable performance, with Babel-9B-Chat leading among 10B-sized LLMs and Babel-83B-Chat setting a new standard for multilingual tasks, reaching the level of commercial models.
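
The abstract names layer extension as the mechanism by which Babel grows its parameter count before further multilingual training. As a rough illustration of what depth up-scaling of this kind can look like, the sketch below duplicates existing decoder layers of a pretrained Llama-style model from Hugging Face transformers; the base checkpoint, the `extend_layers` helper, and the interleaved duplication scheme are assumptions made for illustration, not Babel's documented recipe.

```python
# Minimal sketch of layer extension (depth up-scaling), assuming a Llama-style
# decoder stack exposed as model.model.layers. The duplication pattern and the
# helper name `extend_layers` are illustrative assumptions, not Babel's method.
import copy

import torch.nn as nn
from transformers import AutoModelForCausalLM


def extend_layers(model, insert_every: int = 4):
    """Insert a copy of every `insert_every`-th decoder layer, growing depth
    (and parameter count) before continued pretraining on multilingual data."""
    extended = nn.ModuleList()
    for i, layer in enumerate(model.model.layers):
        extended.append(layer)
        if (i + 1) % insert_every == 0:
            # Start the new layer as an exact copy; its weights diverge
            # from the original once training resumes.
            extended.append(copy.deepcopy(layer))
    model.model.layers = extended
    model.config.num_hidden_layers = len(extended)
    # Recent transformers versions track layer_idx for KV-cache bookkeeping,
    # so reassign indices after reordering the stack.
    for idx, layer in enumerate(extended):
        if hasattr(layer.self_attn, "layer_idx"):
            layer.self_attn.layer_idx = idx
    return model


base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-0.5B")  # placeholder base checkpoint
model = extend_layers(base, insert_every=4)
print(model.config.num_hidden_layers)  # deeper than the original stack
```

The appeal of this approach over plain continued pretraining is that the copied layers give the model extra capacity to absorb new languages while starting from weights that already function, rather than from random initialization.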