Update README.md
README.md
CHANGED
@@ -108,6 +108,14 @@ output_text = tokenizer.decode(outputs[0])
 
 We recommend using the following set of parameters for inference. Note that our model does not have a default system_prompt.
 
+### Use with vLLM
+```SHELL
+pip install vllm --upgrade
+```
+
+```SHELL
+vllm serve zhiqing/Hunyuan-MT-Chimera-7B-INT8
+```
 ```json
 {
   "top_k": 20,
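
Once the server is running, the recommended sampling parameters can be passed directly in a request to vLLM's OpenAI-compatible endpoint. The following is a minimal sketch, not part of the committed README: it assumes vLLM's default server address (http://localhost:8000), uses a placeholder prompt, and takes `top_k` from the recommended parameters above (vLLM accepts it as an extra sampling field in the request body).

```SHELL
# Minimal sketch: query the server started by `vllm serve` above.
# Assumes the default OpenAI-compatible endpoint on localhost:8000;
# the prompt is a placeholder, top_k follows the recommended parameters.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "zhiqing/Hunyuan-MT-Chimera-7B-INT8",
    "messages": [{"role": "user", "content": "Translate the following text into English: 你好，世界"}],
    "top_k": 20
  }'
```

The response is a standard OpenAI-style chat completion, so any OpenAI-compatible client can be pointed at the same endpoint instead of curl.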