## Summary
Plume is the first LLM trained from scratch for Neural Machine Translation using only parallel, Catalan-centric data. It is a language model with the same architecture as Gemma 2B, trained for general sentence-level translation tasks. For more information about the training, architecture, and interpretability of the model, see the paper "Investigating the translation capabilities of Large Language Models trained on parallel data only". The preprint is available on [arXiv]().
- **Developed by:** Machine Translation Unit at the Barcelona Supercomputing Center (BSC).
- **Languages:** Spanish, French, Italian, Portuguese, Galician, German, English, and Basque.
- **License:** Apache License, Version 2.0
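Since the model is described as Catalan-centric, the supported languages above plausibly pair with Catalan in both directions. A minimal sketch of that enumeration, assuming "Catalan-centric" means every translation direction includes Catalan on one side (the paper defines the exact training pairs):

```python
# Enumerate plausible translation directions for a Catalan-centric model.
# Assumption: every direction pairs Catalan ("ca") with one of the listed
# languages; this mirrors the "Catalan-Centric" wording, not a confirmed spec.

LANGS = {
    "es": "Spanish", "fr": "French", "it": "Italian", "pt": "Portuguese",
    "gl": "Galician", "de": "German", "en": "English", "eu": "Basque",
}

def catalan_centric_directions(langs=LANGS):
    """Yield (source, target) ISO 639-1 pairs where one side is Catalan."""
    for code in langs:
        yield ("ca", code)  # Catalan -> other language
        yield (code, "ca")  # other language -> Catalan

directions = list(catalan_centric_directions())
print(len(directions))  # 8 languages x 2 directions = 16 pairs
```

Under this assumption the model covers 16 sentence-level directions, all anchored on Catalan.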