Nile-Chat: Egyptian Language Models for Arabic and Latin Scripts
Abstract
Nile-Chat models, built with the Branch-Train-MiX strategy, outperform existing multilingual and Arabic LLMs on Egyptian-dialect benchmarks in both Arabic and Latin scripts.
We introduce Nile-Chat-4B, 3x4B-A6B, and 12B, a collection of LLMs for the Egyptian dialect, uniquely designed to understand and generate text written in both Arabic and Latin scripts. Specifically, with Nile-Chat-3x4B-A6B, we introduce a novel language adaptation approach that leverages the Branch-Train-MiX strategy to merge script-specialized experts into a single MoE model. Our Nile-Chat models significantly outperform leading multilingual and Arabic LLMs, such as LLaMa, Jais, and ALLaM, on our newly introduced Egyptian evaluation benchmarks, which span both understanding and generative tasks. Notably, our 12B model yields a 14.4% performance gain over Qwen2.5-14B-Instruct on Latin-script benchmarks. All our resources are publicly available. We believe this work presents a comprehensive methodology for adapting LLMs to dual-script languages, addressing an often overlooked aspect in modern LLM development.
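To make the Branch-Train-MiX merge step concrete, below is a minimal PyTorch sketch of turning two script-specialized dense branches into one MoE layer: each branch's feed-forward block becomes an expert, and a router learns to dispatch tokens between them. All names here (ScriptExpertFFN, BTXMoELayer, the dimensions, and top-1 routing) are illustrative assumptions, not the paper's released code; BTX-style recipes also average the non-FFN weights of the branches and train the router on mixed data afterwards.

```python
# Hypothetical sketch of the Branch-Train-MiX merge step: two dense,
# script-specialized FFN branches become experts of a single MoE layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScriptExpertFFN(nn.Module):
    """Feed-forward block of one script-specialized dense branch."""

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(F.silu(self.up(x)))


class BTXMoELayer(nn.Module):
    """MoE layer whose experts are copied from the dense branches.

    The expert weights start from the branch checkpoints; the router is
    new and would be trained on mixed-script data after the merge.
    Top-1 routing is used here purely for brevity.
    """

    def __init__(self, experts: list[ScriptExpertFFN], d_model: int):
        super().__init__()
        self.experts = nn.ModuleList(experts)  # e.g. [arabic_ffn, latin_ffn]
        self.router = nn.Linear(d_model, len(experts))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); route each token independently.
        logits = self.router(x)               # (batch, seq, n_experts)
        weights = logits.softmax(dim=-1)
        top_w, top_idx = weights.max(dim=-1)  # top-1 expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():
                out[mask] = top_w[mask].unsqueeze(-1) * expert(x[mask])
        return out


# Merge: branch FFNs become the experts of the shared MoE layer.
d_model, d_ff = 64, 256
arabic_ffn = ScriptExpertFFN(d_model, d_ff)
latin_ffn = ScriptExpertFFN(d_model, d_ff)
moe = BTXMoELayer([arabic_ffn, latin_ffn], d_model)
tokens = torch.randn(2, 8, d_model)
print(moe(tokens).shape)  # torch.Size([2, 8, 64])
```

Because the experts are initialized from fully trained branches rather than from scratch, only the router (and a light finetune) is needed to combine the two script specializations, which is what makes this an attractive adaptation path for dual-script languages.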