OpenBA: An Open-sourced 15B Bilingual Asymmetric seq2seq Model Pre-trained from Scratch
September 19, 2023
Authors: Juntao Li, Zecheng Tang, Yuyang Ding, Pinzheng Wang, Pei Guo, Wangjie You, Dan Qiao, Wenliang Chen, Guohong Fu, Qiaoming Zhu, Guodong Zhou, Min Zhang
cs.AI
Abstract
Large language models (LLMs) with billions of parameters have demonstrated
outstanding performance on various natural language processing tasks. This
report presents OpenBA, an open-sourced 15B bilingual asymmetric seq2seq model,
to contribute an LLM variant to the Chinese-oriented open-source model
community. We enhance OpenBA with effective and efficient techniques and adopt a
three-stage training strategy to train the model from scratch. Our solution
achieves very competitive performance with only 380B tokens, outperforming
LLaMA-70B on the BELEBELE benchmark, BLOOM-176B on the MMLU benchmark, and
GLM-130B on the C-Eval (hard) benchmark. This report provides
the main details to pre-train an analogous model, including pre-training data
processing, Bilingual Flan data collection, the empirical observations that
inspire our model architecture design, training objectives of different stages,
and other enhancement techniques. We have refactored our code to follow the
design principles of the Hugging Face Transformers library, making it more
convenient for developers to use, and released checkpoints of different
training stages at https://huggingface.co/openBA. More details of our project
are available at https://github.com/OpenNLG/openBA.git.
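
Since the code follows Hugging Face Transformers conventions and the checkpoints are hosted on the hub, loading a released model should reduce to the standard `from_pretrained` pattern. The sketch below is illustrative only: the repository id `OpenBA/OpenBA-Flan` is a hypothetical placeholder (the actual checkpoint names are listed at https://huggingface.co/openBA), and whether `trust_remote_code` is needed depends on how the model class is registered.

```python
# Minimal sketch of loading an OpenBA checkpoint with Hugging Face Transformers.
# Assumes a seq2seq model exposed via the standard Auto* classes.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repository id; substitute a real one from https://huggingface.co/openBA
checkpoint = "OpenBA/OpenBA-Flan"

# trust_remote_code may be unnecessary if the architecture ships with Transformers
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, trust_remote_code=True)

# Seq2seq usage: the encoder reads the prompt, the decoder generates the output.
inputs = tokenizer("Translate to Chinese: Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```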