docs: fix VLLM installation guideline
README.md (changed):
````diff
@@ -48,7 +48,11 @@ Make sure to install the latest version of `transformers` or `vllm`, eventually
 pip install git+https://github.com/huggingface/transformers.git
 ```
 
-
+For vLLM, make sure to install `vllm>=0.9.0`:
+
+```bash
+pip install "vllm>=0.9.0"
+```
 
 ### 🤗 transformers
 
@@ -74,7 +78,7 @@ model = AutoModelForCausalLM.from_pretrained(
 For vLLM, simply start a server by executing the command below:
 
 ```
-# pip install vllm
+# pip install vllm>=0.9.0
 vllm serve tiiuae/Falcon-H1-1B-Instruct --tensor-parallel-size 2 --data-parallel-size 1
 ```
 
````
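For reference, `vllm serve` exposes an OpenAI-compatible API (at `http://localhost:8000/v1` by default), so the updated instructions can be smoke-tested with a short client script. A minimal sketch, assuming the server from the diff is running locally and the `openai` Python client is installed; the prompt text is illustrative:

```python
# A minimal sketch: query the vLLM server started above through its
# OpenAI-compatible endpoint (default: http://localhost:8000/v1).
# Assumes `pip install openai`; the api_key is a placeholder, since
# the local server does not require one by default.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="tiiuae/Falcon-H1-1B-Instruct",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

Note that `--tensor-parallel-size 2` in the serve command shards the model across two GPUs; on a single-GPU machine, drop the flag or set it to 1.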