mihaimasala committed · Commit dc39fad (verified) · 1 Parent(s): f2bf454

Create README.md

Files changed (1): README.md (+106 -0)

---
license: cc-by-nc-4.0
language:
- ro
---

# Model Card for RoGemma-7b-Instruct

RoGemma is a family of pretrained and fine-tuned generative text models for Romanian. This is the repository for the **instruct 7B model**. Links to the other models can be found at the bottom of this page.

## Model Details

### Model Description

OpenLLM-Ro represents the first open-source effort to build an LLM specialized for Romanian. OpenLLM-Ro develops and publicly releases a collection of Romanian LLMs, both as foundational models and as instruct and chat variants.

- **Developed by:** OpenLLM-Ro
- **Language(s):** Romanian
- **License:** cc-by-nc-4.0
- **Finetuned from model:** [gemma-7b](https://huggingface.co/google/gemma-7b)

### Model Sources

- **Repository:** https://github.com/OpenLLM-Ro/llama-recipes
- **Paper:** https://arxiv.org/abs/2406.18266

## Intended Use

### Intended Use Cases

RoGemma is intended for research use in Romanian. Base models can be adapted for a variety of natural language tasks, while instruction- and chat-tuned models are intended for assistant-like chat.

### Out-of-Scope Use

Use in any manner that violates the license or any applicable laws or regulations, and use in languages other than Romanian.

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoGemma-7b-Instruct")
model = AutoModelForCausalLM.from_pretrained("OpenLLM-Ro/RoGemma-7b-Instruct")

# Build a single-turn conversation and render it with the model's chat template
instruction = "Ce jocuri de societate pot juca cu prietenii mei?"  # "What board games can I play with my friends?"
chat = [
    {"role": "user", "content": instruction},
]
prompt = tokenizer.apply_chat_template(chat, tokenize=False, system_message="")

# Tokenize the rendered prompt and generate a response
inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt")
outputs = model.generate(input_ids=inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
```
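
If a GPU is available, the model can also be loaded in half precision and run on that device. The snippet below is a minimal sketch, not part of the original card, assuming a CUDA-capable GPU with bfloat16 support and the `accelerate` package installed (needed for `device_map="auto"`):

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumption: CUDA GPU with bfloat16 support; `accelerate` installed for device_map.
tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoGemma-7b-Instruct")
model = AutoModelForCausalLM.from_pretrained(
    "OpenLLM-Ro/RoGemma-7b-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

chat = [{"role": "user", "content": "Ce jocuri de societate pot juca cu prietenii mei?"}]
prompt = tokenizer.apply_chat_template(chat, tokenize=False, system_message="")

# Move the tokenized prompt to the model's device before generating
inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt").to(model.device)
outputs = model.generate(input_ids=inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Gemma models are commonly run in bfloat16; on GPUs without bfloat16 support, `torch.float16` is a reasonable fallback.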

## Benchmarks

| Model                  | Average | ARC   | MMLU  | Winogrande | HellaSwag | GSM8k | TruthfulQA |
|------------------------|:-------:|:-----:|:-----:|:----------:|:---------:|:-----:|:----------:|
| google/gemma-1.1-7b-it | 41.39   | 40.05 | 47.12 | 54.62      | 47.10     | 9.73  | 49.75      |
| *RoGemma-7b-Instruct*  | ***53.65*** | ***52.77*** | ***54.69*** | ***69.10*** | ***61.97*** | ***31.97*** | ***51.43*** |

## MT-Bench

| Model                  | Average | 1st turn | 2nd turn |
|------------------------|:-------:|:--------:|:--------:|
| google/gemma-1.1-7b-it | 4.63    | 5.18     | 4.08     |
| *RoGemma-7b-Instruct*  | ***4.83*** | ***5.56*** | ***4.10*** |

## RoGemma Model Family

| Model                 | Link |
|-----------------------|:----:|
| *RoGemma-7b-Instruct* | [link](https://huggingface.co/OpenLLM-Ro/RoGemma-7b-Instruct) |