auto-patch README.md
README.md CHANGED
@@ -14,7 +14,6 @@ quantized_by: mradermacher
 <!-- ### vocab_type: -->
 static quants of https://huggingface.co/jukofyork/Dark-Miqu-70B
 
-
 <!-- provided-files -->
 weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
 ## Usage
@@ -31,7 +30,6 @@ more details, including on how to concatenate multi-part files.
 |:-----|:-----|--------:|:------|
 | [PART 1](https://huggingface.co/mradermacher/Dark-Miqu-70B-GGUF/resolve/main/Dark-Miqu-70B.Q8_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Dark-Miqu-70B-GGUF/resolve/main/Dark-Miqu-70B.Q8_0.gguf.part2of2) | Q8_0 | 73.4 | fast, best quality |
 
-
 Here is a handy graph by ikawrakow comparing some lower-quality quant
 types (lower is better):
 
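For reference, the multi-part Q8_0 file in the table above is meant to be reassembled by concatenating its parts in order, as the usage notes referenced in the second hunk describe. A minimal sketch in Python, assuming both part files have already been downloaded into the working directory (file names taken from the table; the chunk size is an arbitrary choice):

```python
# Minimal sketch: rebuild a split GGUF file by concatenating its parts in order.
# Assumes Dark-Miqu-70B.Q8_0.gguf.part1of2 and .part2of2 are in the current directory.
parts = [
    "Dark-Miqu-70B.Q8_0.gguf.part1of2",
    "Dark-Miqu-70B.Q8_0.gguf.part2of2",
]

with open("Dark-Miqu-70B.Q8_0.gguf", "wb") as out:
    for name in parts:
        with open(name, "rb") as part:
            # Copy in 64 MiB chunks so the ~73 GiB total is never held in memory.
            while chunk := part.read(64 * 1024 * 1024):
                out.write(chunk)
```

On Linux or macOS the same result comes from concatenating the parts with a shell pipeline instead; the point is only that the parts are raw byte splits that join back into a single GGUF file.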