
FuxiMT: Sparsifying Large Language Models for Chinese-Centric Multilingual Machine Translation

May 20, 2025
Authors: Shaolin Zhu, Tianyu Dong, Bo Li, Deyi Xiong
cs.AI

Abstract

In this paper, we present FuxiMT, a novel Chinese-centric multilingual machine translation model powered by a sparsified large language model (LLM). We adopt a two-stage strategy to train FuxiMT: we first pre-train the model on a massive Chinese corpus and then conduct multilingual fine-tuning on a large parallel dataset spanning 65 languages. FuxiMT incorporates a Mixture-of-Experts (MoE) architecture and employs a curriculum learning strategy to achieve robust performance across varying resource levels. Experimental results demonstrate that FuxiMT significantly outperforms strong baselines, including state-of-the-art LLMs and machine translation models, particularly under low-resource scenarios. Furthermore, FuxiMT exhibits remarkable zero-shot translation capabilities for unseen language pairs, indicating its potential to bridge communication gaps where parallel data are scarce or unavailable.
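
To make the sparsification idea concrete, below is a minimal sketch of a top-k gated Mixture-of-Experts feed-forward layer of the kind the abstract alludes to. This is an illustrative assumption, not FuxiMT's actual implementation: the class name `SparseMoELayer`, the expert count, `top_k`, and the dimensions are all placeholders chosen for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Top-k gated MoE feed-forward layer (illustrative sketch only).

    Each token is routed to its top_k experts; only those experts run,
    which is the source of the sparsity in MoE models.
    """

    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten tokens for per-token routing
        tokens = x.reshape(-1, x.size(-1))
        logits = self.gate(tokens)                       # (n_tokens, n_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)             # renormalize over chosen experts
        out = torch.zeros_like(tokens)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e             # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)

# Usage: a (batch, seq, d_model) tensor passes through unchanged in shape.
layer = SparseMoELayer(d_model=512, d_ff=2048)
y = layer(torch.randn(4, 16, 512))  # -> (4, 16, 512)
```

Routing each token to only `top_k` of the experts keeps the per-token compute roughly constant while the total parameter count grows with the number of experts, which is the usual motivation for pairing MoE layers with large multilingual models.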
