![](https://huggingface.co/lodrick-the-lafted/Olethros-8B/resolve/main/olethros.png)
# Olethros-8B-AWQ
L3-8b-Instruct tuned on roughly 6000 Opus generations in the hopes of adding a bit of sovl.
This is the 4-bit, group size 128 AWQ quant.
The original model weights are [here](https://huggingface.co/lodrick-the-lafted/Olethros-8B).
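A minimal sketch for loading this AWQ quant with `transformers` (requires the `autoawq` package and a CUDA GPU); the prompt and generation settings below are illustrative assumptions, not recommendations from the model author:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lodrick-the-lafted/Olethros-8B-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # AWQ checkpoints run in fp16
    device_map="auto",          # requires accelerate
)

# Use the Llama-3-Instruct chat template baked into the tokenizer.
messages = [{"role": "user", "content": "Tell me a story about a lighthouse keeper."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```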
## Quants
Exl2 and AWQ quants are available right now.
Type | Misc | Author |
---|---|---|
GGUF | Static GGUF Quants | mradermacher |
AWQ | 4-bit, group size 128 | lodrick |
exl2 | 2.25bpw | blockblockblock |
exl2 | 2.5bpw | blockblockblock |
exl2 | 3.0bpw | blockblockblock |
exl2 | 3.5bpw | blockblockblock |
exl2 | 3.7bpw | blockblockblock |
exl2 | 4.0bpw | blockblockblock |
exl2 | 4.2bpw | blockblockblock |
exl2 | 4.4bpw | blockblockblock |
exl2 | 4.6bpw | blockblockblock |
exl2 | 4.8bpw | blockblockblock |
exl2 | 5.0bpw | blockblockblock |
exl2 | 5.5bpw | blockblockblock |
exl2 | 6.0bpw | blockblockblock |
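For the static GGUF quants listed above, a quick local-inference sketch with `llama-cpp-python`; the filename below is a hypothetical example of mradermacher's usual naming, so substitute whichever quant file you actually download:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="Olethros-8B.Q4_K_M.gguf",  # hypothetical filename from the GGUF repo
    n_ctx=8192,                            # Llama-3 context length
    n_gpu_layers=-1,                       # offload all layers to GPU if available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a short poem about the sea."}]
)
print(out["choices"][0]["message"]["content"])
```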