Update README.md

README.md CHANGED
@@ -48,8 +48,11 @@ Make sure to install the latest version of `transformers` or `vllm`, eventually
 pip install git+https://github.com/huggingface/transformers.git
 ```
 
-
+For vLLM, make sure to install `vllm>=0.9.0`:
 
+```bash
+pip install "vllm>=0.9.0"
+```
 ### 🤗 transformers
 
 Refer to the snippet below to run H1 models using 🤗 transformers:
@@ -74,7 +77,7 @@ model = AutoModelForCausalLM.from_pretrained(
 For vLLM, simply start a server by executing the command below:
 
 ```
-# pip install vllm
+# pip install vllm>=0.9.0
 vllm serve tiiuae/Falcon-H1-1B-Instruct --tensor-parallel-size 2 --data-parallel-size 1
 ```
 
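The README section being edited points readers to a transformers snippet for running the H1 models, and the second hunk's header shows it begins with `model = AutoModelForCausalLM.from_pretrained(`. The full snippet is not part of this diff; the sketch below is only an illustrative stand-in, assuming the `tiiuae/Falcon-H1-1B-Instruct` checkpoint from the vLLM command, bfloat16 weights, and the chat template shipped with the tokenizer.

```python
# Illustrative sketch only -- the README's actual snippet is not shown in this diff.
# Assumes tiiuae/Falcon-H1-1B-Instruct and a recent `transformers` build installed
# from the main branch, as the first hunk instructs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon-H1-1B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 is sufficient for the 1B checkpoint
    device_map="auto",
)

# Format a single-turn chat prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Briefly introduce the Falcon-H1 model family."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```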
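Once the `vllm serve` command from the second hunk is running, the model can be queried through vLLM's OpenAI-compatible API. The sketch below is not part of the README; it assumes the server's default address (`http://localhost:8000/v1`) and uses the `openai` Python client with a placeholder API key, since the command above does not configure authentication.

```python
from openai import OpenAI

# Point the client at the local vLLM server started by `vllm serve` above.
# Assumption: default host/port; adjust base_url if the server was launched differently.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="tiiuae/Falcon-H1-1B-Instruct",
    messages=[{"role": "user", "content": "Give a one-sentence summary of Falcon-H1."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```

The same endpoint also accepts plain HTTP requests, so a `curl` POST against `/v1/chat/completions` works equally well if the `openai` client is not available.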