RakutenAI-7B: Extending Large Language Models for Japanese
March 21, 2024
Authors: Rakuten Group, Aaron Levine, Connie Huang, Chenguang Wang, Eduardo Batista, Ewa Szymanska, Hongyi Ding, Hou Wei Chou, Jean-François Pessiot, Johanes Effendi, Justin Chiu, Kai Torben Ohlhus, Karan Chopra, Keiji Shinzato, Koji Murakami, Lee Xiong, Lei Chen, Maki Kubota, Maksim Tkachenko, Miroku Lee, Naoki Takahashi, Prathyusha Jwalapuram, Ryutaro Tatsushima, Saurabh Jain, Sunil Kumar Yadav, Ting Cai, Wei-Te Chen, Yandi Xia, Yuki Nakayama, Yutaka Higashiyama
cs.AI
Abstract
We introduce RakutenAI-7B, a suite of Japanese-oriented large language models
that achieve the best performance on the Japanese LM Harness benchmarks among
the open 7B models. Along with the foundation model, we release instruction-
and chat-tuned models, RakutenAI-7B-instruct and RakutenAI-7B-chat
respectively, under the Apache 2.0 license.
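Since the abstract announces publicly released checkpoints, a minimal sketch of loading and prompting the instruction-tuned model with Hugging Face Transformers follows. The repository ID `Rakuten/RakutenAI-7B-instruct`, the dtype, and the generation settings are assumptions for illustration, not details taken from the paper; consult the official release for the authoritative model IDs and recommended usage.

```python
# Sketch: load and query the instruction-tuned model via Hugging Face Transformers.
# The repo ID below is an assumption based on the model name in the abstract.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rakuten/RakutenAI-7B-instruct"  # assumed Hugging Face repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed precision; adjust to available hardware
    device_map="auto",
)

# Example Japanese prompt: "What is the capital of Japan?"
prompt = "日本の首都はどこですか？"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```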