Update README.md
README.md
CHANGED
@@ -164,10 +164,10 @@ to implement production-ready inference pipelines.
 
 **_Installation_**
 
-Make sure you install [`vLLM
+Make sure you install [`vLLM >= 0.8.0`](https://github.com/vllm-project/vllm/releases/tag/v0.8.0):
 
 ```
-pip install vllm --
+pip install vllm --upgrade
 ```
 
 Doing so should automatically install [`mistral_common >= 1.5.4`](https://github.com/mistralai/mistral-common/releases/tag/v1.5.4).
@@ -177,7 +177,7 @@ To check:
 python -c "import mistral_common; print(mistral_common.__version__)"
 ```
 
-You can also make use of a ready-to-go [docker image](https://github.com/vllm-project/vllm/blob/main/Dockerfile) or on the [docker hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images/sha256-de9032a92ffea7b5c007dad80b38fd44aac11eddc31c435f8e52f3b7404bbf39)
+You can also make use of a ready-to-go [docker image](https://github.com/vllm-project/vllm/blob/main/Dockerfile) or one from the [docker hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images/sha256-de9032a92ffea7b5c007dad80b38fd44aac11eddc31c435f8e52f3b7404bbf39).
 
 #### Server
 
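For the Docker route added in this change, a minimal sketch of checking the vLLM version and starting the ready-to-go `vllm/vllm-openai` image is shown below. The model identifier, GPU flags, and port are assumptions rather than part of this README and should be adapted to the model the card describes.

```
# Verify that the installed vLLM meets the >= 0.8.0 requirement.
python -c "import vllm; print(vllm.__version__)"

# Sketch of running the ready-to-go image (model id, port, and GPU flags are placeholders).
docker run --gpus all \
    -v ~/.cache/huggingface:/root/.cache/huggingface \
    -p 8000:8000 \
    --ipc=host \
    vllm/vllm-openai:latest \
    --model <model-id> \
    --tokenizer-mode mistral  # typically used for Mistral models served via mistral_common
```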