ybelkada committed (verified) · Commit f7c85cb · 1 Parent(s): 8c5ad10

docs: fix VLLM installation guideline

Files changed (1): README.md (+6 −2)
README.md CHANGED
````diff
@@ -48,7 +48,11 @@ Make sure to install the latest version of `transformers` or `vllm`, eventually
 pip install git+https://github.com/huggingface/transformers.git
 ```
 
-Refer to [the official vLLM documentation for more details on building vLLM from source](https://docs.vllm.ai/en/latest/getting_started/installation/gpu.html#build-wheel-from-source).
+For vLLM, make sure to install `vllm>=0.9.0`:
+
+```bash
+pip install "vllm>=0.9.0"
+```
 
 ### 🤗 transformers
 
@@ -74,7 +78,7 @@ model = AutoModelForCausalLM.from_pretrained(
 For vLLM, simply start a server by executing the command below:
 
 ```
-# pip install vllm
+# pip install vllm>=0.9.0
 vllm serve tiiuae/Falcon-H1-1B-Instruct --tensor-parallel-size 2 --data-parallel-size 1
 ```
````
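The `vllm serve` command in the diff starts vLLM's OpenAI-compatible HTTP server (by default on `http://localhost:8000/v1`). As a minimal sketch of how a client would talk to it — assuming the default host/port, and with `build_chat_request` being a hypothetical helper introduced here for illustration — a chat-completion request body for the served Falcon-H1 model looks like this:

```python
import json


def build_chat_request(prompt: str,
                       model: str = "tiiuae/Falcon-H1-1B-Instruct") -> str:
    """Return the JSON body for a POST to /v1/chat/completions.

    This mirrors the OpenAI chat-completions request shape that the
    vLLM server accepts; the endpoint URL and default port are vLLM's
    documented defaults, not something specific to this commit.
    """
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    })


body = build_chat_request("Hello, Falcon!")
# POST `body` (Content-Type: application/json) to
# http://localhost:8000/v1/chat/completions once the server is running.
```

Any OpenAI-compatible client (e.g. the `openai` Python package pointed at `base_url="http://localhost:8000/v1"`) can send this request without custom code.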