

Meltemi: The first open Large Language Model for Greek

July 30, 2024
作者: Leon Voukoutis, Dimitris Roussis, Georgios Paraskevopoulos, Sokratis Sofianopoulos, Prokopis Prokopidis, Vassilis Papavasileiou, Athanasios Katsamanis, Stelios Piperidis, Vassilis Katsouros
cs.AI

Abstract

We describe the development and capabilities of Meltemi 7B, the first open Large Language Model for the Greek language. Meltemi 7B has 7 billion parameters and is trained on a 40 billion token Greek corpus. To develop Meltemi 7B, we adapt Mistral through continual pretraining on the Greek corpus. Meltemi 7B incorporates information up to September 2023. Furthermore, we have translated and curated a Greek instruction corpus, which has been used for instruction tuning a chat model, named Meltemi 7B Instruct. Special care was taken in aligning Meltemi 7B Instruct and removing toxic content from its training data. The developed models are evaluated on a broad set of collected evaluation corpora, and examples of prompts and responses are presented. Both Meltemi 7B and Meltemi 7B Instruct are available at https://huggingface.co/ilsp under the Apache 2.0 license.
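As a rough illustration of the instruction-tuning step described above, the sketch below formats one translated instruction/response pair into a single training string. The chat markers and field names are assumptions made for illustration; the paper does not specify Meltemi's actual chat template.

```python
# Minimal sketch of preparing one instruction example for chat fine-tuning.
# The <|user|>/<|assistant|> markers and the "instruction"/"response" field
# names are hypothetical, not Meltemi's documented format.

def format_example(instruction: str, response: str) -> str:
    """Join an instruction/response pair into a single training string."""
    return f"<|user|>\n{instruction}\n<|assistant|>\n{response}"

# One (hypothetical) example from a translated Greek instruction corpus.
example = {
    "instruction": "Ποια είναι η πρωτεύουσα της Ελλάδας;",   # "What is the capital of Greece?"
    "response": "Η πρωτεύουσα της Ελλάδας είναι η Αθήνα.",   # "The capital of Greece is Athens."
}

text = format_example(example["instruction"], example["response"])
print(text)
```

Strings formatted this way would then be tokenized and used as supervised fine-tuning targets for the chat model.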

