lbourdois committed
Commit dc43cf0 · verified · 1 Parent(s): 8fae54d

Improve language tag


Hi! As the model is multilingual, this PR adds languages other than English to the language tag to improve discoverability. Note that 29 languages are announced in the README, but only 13 are explicitly listed, so I was only able to add those 13.

Files changed (1)
  1. README.md +44 -32
README.md CHANGED
@@ -1,32 +1,44 @@
- ---
- base_model: Qwen/Qwen2.5-3B-Instruct
- language:
- - en
- library_name: transformers
- license: other
- license_name: qwen-research
- license_link: https://huggingface.co/Qwen/Qwen2.5-3B-Instruct/blob/main/LICENSE
- pipeline_tag: text-generation
- tags:
- - chat
- - openvino
- - openvino-export
- ---
-
- This model was converted to OpenVINO from [`Qwen/Qwen2.5-3B-Instruct`](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct) using [optimum-intel](https://github.com/huggingface/optimum-intel)
- via the [export](https://huggingface.co/spaces/echarlaix/openvino-export) space.
-
- First make sure you have optimum-intel installed:
-
- ```bash
- pip install optimum[openvino]
- ```
-
- To load your model you can do as follows:
-
- ```python
- from optimum.intel import OVModelForCausalLM
-
- model_id = "HelloSun/Qwen2.5-3B-Instruct-openvino"
- model = OVModelForCausalLM.from_pretrained(model_id)
- ```
+ ---
+ base_model: Qwen/Qwen2.5-3B-Instruct
+ language:
+ - zho
+ - eng
+ - fra
+ - spa
+ - por
+ - deu
+ - ita
+ - rus
+ - jpn
+ - kor
+ - vie
+ - tha
+ - ara
+ library_name: transformers
+ license: other
+ license_name: qwen-research
+ license_link: https://huggingface.co/Qwen/Qwen2.5-3B-Instruct/blob/main/LICENSE
+ pipeline_tag: text-generation
+ tags:
+ - chat
+ - openvino
+ - openvino-export
+ ---
+
+ This model was converted to OpenVINO from [`Qwen/Qwen2.5-3B-Instruct`](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct) using [optimum-intel](https://github.com/huggingface/optimum-intel)
+ via the [export](https://huggingface.co/spaces/echarlaix/openvino-export) space.
+
+ First make sure you have optimum-intel installed:
+
+ ```bash
+ pip install optimum[openvino]
+ ```
+
+ To load your model you can do as follows:
+
+ ```python
+ from optimum.intel import OVModelForCausalLM
+
+ model_id = "HelloSun/Qwen2.5-3B-Instruct-openvino"
+ model = OVModelForCausalLM.from_pretrained(model_id)
+ ```
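As a side note for anyone trying the loading snippet above: a minimal generation sketch is shown below. It is not part of this commit, and it assumes the tokenizer files (and the Qwen2.5-Instruct chat template) were exported alongside the OpenVINO model; if they were not, the tokenizer can be loaded from `Qwen/Qwen2.5-3B-Instruct` instead.

```python
# Minimal usage sketch (not part of this commit): chat-style generation with
# the OpenVINO model loaded via optimum-intel. Assumes the tokenizer and chat
# template are available in the same repo as the converted model.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "HelloSun/Qwen2.5-3B-Instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a chat prompt using the model's chat template.
messages = [{"role": "user", "content": "Give me a one-sentence summary of OpenVINO."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate and print only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```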