Update README.md
README.md CHANGED
@@ -45,7 +45,7 @@ We advise you to clone [`llama.cpp`](https://github.com/ggerganov/llama.cpp) and

 huggingface-cli download xverse/XVERSE-7B-Chat-GGUF xverse-7b-chat-q4_k_m.gguf --local-dir . --local-dir-use-symlinks False
 ```

-We demonstrate how to use `llama.cpp` to run xverse-7b
+We demonstrate how to use `llama.cpp` to run the xverse-7b-chat-q4_k_m.gguf model:

 ```bash
 ./main -m xverse-7b-chat-q4_k_m.gguf -n 512 --color -i --temp 0.85 --top_k 30 --top_p 0.85 --repeat_penalty 1.1 -ins # -ngl 99 for GPU

@@ -59,7 +59,7 @@ Cloning the repo may be inefficient, and thus you can manually download the GGUF

 huggingface-cli download xverse/XVERSE-7B-Chat-GGUF xverse-7b-chat-q4_k_m.gguf --local-dir . --local-dir-use-symlinks False
 ```

-We demonstrate how to use `llama.cpp` to run xverse-7b
+We demonstrate how to use `llama.cpp` to run the xverse-7b-chat-q4_k_m.gguf model:

 ```shell
 ./main -m xverse-7b-chat-q4_k_m.gguf -n 512 --color -i --temp 0.85 --top_k 30 --top_p 0.85 --repeat_penalty 1.1 -ins # -ngl 99 for GPU
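For reference, the full download-and-run flow from the README, with each flag annotated. This is a sketch, not part of the diff itself: it assumes `llama.cpp` has already been cloned and built in the current directory, and flag semantics are those of the `main` example binary at the time this README was written (the CLI has since been renamed `llama-cli` in newer llama.cpp releases).

```shell
# Fetch only the quantized model file (not the whole repo), into the
# current directory as a real file rather than a cache symlink.
huggingface-cli download xverse/XVERSE-7B-Chat-GGUF xverse-7b-chat-q4_k_m.gguf \
    --local-dir . --local-dir-use-symlinks False

# -m        : path to the GGUF model
# -n 512    : generate at most 512 tokens per response
# -i -ins   : interactive instruction-following (chat) mode
# --temp / --top_k / --top_p / --repeat_penalty : sampling parameters
# -ngl 99   : (optional) offload all layers to GPU; requires a GPU-enabled build
./main -m xverse-7b-chat-q4_k_m.gguf -n 512 --color -i -ins \
    --temp 0.85 --top_k 30 --top_p 0.85 --repeat_penalty 1.1 # -ngl 99
```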