Update README.md
README.md
CHANGED
@@ -39,7 +39,7 @@ pipe = pipeline(
 )
 
 messages = [
-    {"role": "user", "content": '
+    {"role": "user", "content": ' شنو معنى برشا'},
 ]
 
 outputs = pipe(messages, max_new_tokens=128, temperature=0.0)
@@ -47,7 +47,7 @@ assistant_response = outputs[0]["generated_text"][-1]["content"].strip()
 print(assistant_response)
 ```
 ```
-- Response
+- Response: برشّا هي كلمة تعني كتر من واحد حاجة
 ```
 #### Running the model on a single / multi GPU
 ```python
@@ -63,7 +63,7 @@ model = AutoModelForCausalLM.from_pretrained(
 tokenizer = AutoTokenizer.from_pretrained(model_id)
 
 messages = [
-    {"role": "user", "content": "
+    {"role": "user", "content": "شنو معنى لاباس"},
 ]
 
 input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt", return_dict=True , add_generation_prompt=True).to(model.device)
@@ -73,7 +73,7 @@ outputs = model.generate(**input_ids, max_new_tokens=128)
 print(tokenizer.decode(outputs[0]))
 ```
 ```
-- Response
+- Response: لاباس هو كلمة جاية من العربية، معناها هل أنت بخير
 ```
 ## Citations
 When using this model ** Barcha-7B-Instruct **, please cite:
@@ -81,7 +81,7 @@ When using this model ** Barcha-7B-Instruct **, please cite:
 ```bibtex
 @model{linagora2025LLM-tn,
 author = {Wajdi Ghezaiel and Jean-Pierre Lorré},
-title = {Barcha-7B-Instruct :Tunisian Arabic Derja LLM},
+title = {Barcha-7B-Instruct: Tunisian Arabic Derja LLM based on Qwen2-7B},
 year = {2025},
 month = {July},
 url = {https://huggingface.co/datasets/linagora/Barcha-7B-Instruct}
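For context, the hunks above show only a slice of the README's `pipeline` example. A minimal sketch of how the updated example plausibly reads end to end is given below; the imports, the `model_id` value, and the `pipeline(...)` arguments are assumptions, not part of this commit.

```python
# Minimal sketch of the updated chat-pipeline example.
# Everything outside the diff hunks above (imports, model_id, dtype/device
# settings) is assumed rather than taken from this commit.
import torch
from transformers import pipeline

model_id = "linagora/Barcha-7B-Instruct"  # assumed repository id

pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "شنو معنى برشا"},
]

# Greedy decoding, matching the README's temperature=0.0 setting.
outputs = pipe(messages, max_new_tokens=128, temperature=0.0)
assistant_response = outputs[0]["generated_text"][-1]["content"].strip()
print(assistant_response)
```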
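Similarly, a sketch of the updated single / multi GPU example; the `from_pretrained` arguments and `model_id` are assumed, while the `messages`, `apply_chat_template`, and `generate` lines follow the hunks above.

```python
# Minimal sketch of the updated single / multi GPU example.
# model_id and the from_pretrained arguments are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "linagora/Barcha-7B-Instruct"  # assumed repository id

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # spread the model over the available GPU(s)
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [
    {"role": "user", "content": "شنو معنى لاباس"},
]

# Build the chat-formatted prompt and move it to the model's device.
input_ids = tokenizer.apply_chat_template(
    messages, return_tensors="pt", return_dict=True, add_generation_prompt=True
).to(model.device)

outputs = model.generate(**input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0]))
```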