Update README.md
README.md CHANGED

@@ -70,10 +70,13 @@ For a production-ready instruction model please use [Mistral-Small-3.1-24B-Instr
 
 **_Installation_**
 
-
+We recommend using this model with the [vLLM library](https://github.com/vllm-project/vllm)
+to implement production-ready inference pipelines.
+
+Make sure you install [`vLLM >= 0.8.0`](https://github.com/vllm-project/vllm/releases/tag/v0.8.0):
 
 ```
-pip install vllm --
+pip install vllm --upgrade
 ```
 
 Doing so should automatically install [`mistral_common >= 1.5.4`](https://github.com/mistralai/mistral-common/releases/tag/v1.5.4).
@@ -83,7 +86,7 @@ To check:
 python -c "import mistral_common; print(mistral_common.__version__)"
 ```
 
-You can also make use of a ready-to-go [docker image](https://github.com/vllm-project/vllm/blob/main/Dockerfile) or on the [docker hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images/sha256-de9032a92ffea7b5c007dad80b38fd44aac11eddc31c435f8e52f3b7404bbf39)
+You can also make use of a ready-to-go [docker image](https://github.com/vllm-project/vllm/blob/main/Dockerfile) or one from [Docker Hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images/sha256-de9032a92ffea7b5c007dad80b38fd44aac11eddc31c435f8e52f3b7404bbf39).
 
 **_Example_**
 
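The two version floors the diff states (vLLM >= 0.8.0, `mistral_common >= 1.5.4`) can be checked with a small script rather than by eye; a minimal sketch, where the `meets_minimum` helper is ours and not part of vLLM or mistral_common, and which does plain numeric comparison with no pre-release handling:

```python
def meets_minimum(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically, e.g. "1.5.4" >= "1.5.3"."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(required)

# Minimums stated in the README.
print(meets_minimum("0.8.0", "0.8.0"))  # True: meets the vLLM floor
print(meets_minimum("1.5.3", "1.5.4"))  # False: below the mistral_common floor
```

In practice you would feed it the output of the README's own check, e.g. `python -c "import mistral_common; print(mistral_common.__version__)"`.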