BigTrans: Augmenting Large Language Models with Multilingual Translation Capability over 100 Languages
May 29, 2023
Authors: Wen Yang, Chong Li, Jiajun Zhang, Chengqing Zong
cs.AI
Abstract
Large language models (LLMs) demonstrate promising translation performance
across various natural languages. However, many LLMs, especially open-sourced
ones such as BLOOM and LLaMA, are English-dominant and support only dozens of
natural languages, leaving the potential of LLMs for language translation
underexplored. In this work, we present BigTrans, which adapts LLaMA, a model
covering only 20 languages, and enhances it with multilingual translation
capability for more than 100 languages. BigTrans is built upon LLaMA-13B and
optimized in three steps. First, we continue training LLaMA on massive Chinese
monolingual data. Second, we continue training the model on a large-scale
parallel dataset covering 102 natural languages. Third, we instruction-tune the
foundation model with multilingual translation instructions, yielding our
BigTrans model. Preliminary experiments on multilingual translation show that
BigTrans performs comparably with ChatGPT and Google Translate in many
languages and even outperforms ChatGPT in 8 language pairs. We release the
BigTrans model and hope it can advance research progress.
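The final stage of the three-step recipe, instruction tuning on multilingual translation instructions, can be sketched as below. This is a minimal illustration of how a parallel sentence pair might be wrapped into an instruction-response training example; the prompt template, field names, and helper function are assumptions for illustration, not the actual BigTrans data format.

```python
def build_translation_instruction(src_lang: str, tgt_lang: str,
                                  src_text: str, tgt_text: str) -> dict:
    """Wrap a parallel sentence pair into an instruction-response pair.

    The instruction names the language pair and carries the source text;
    the response is the reference translation the model is trained to emit.
    """
    instruction = (
        f"Translate the following {src_lang} text into {tgt_lang}:\n"
        f"{src_text}"
    )
    return {"instruction": instruction, "response": tgt_text}


# One training example built from a Chinese-English parallel pair.
example = build_translation_instruction(
    "Chinese", "English", "你好,世界", "Hello, world"
)
print(example["instruction"])
```

In practice, many such examples over the 102 covered languages would be fed to a standard supervised fine-tuning loop on top of the continued-pretraining checkpoint.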