Summoning EXL2 version ;)
This model sounds exciting! @LoneStriker or @Panchovix - would you consider making an EXL2 3bpw quant, possibly with rpcal?
I can try running this through Turbo's new quant method, which does a better job at low bpw. It'll still take ages, though not as long as before.
@LoneStriker Is the new quant method you're referring to QuIP? I hear pretty miraculous things about it for 2-bit quants compared to everything else.
It's the improved exllamav2 quant method; it should do much better at the low bpw we'll need for this model. exl2 is already comparable to QuIP in many ways and is much faster.
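For anyone who wants to try quantizing a model themselves, here's a rough sketch of driving exllamav2's convert.py from Python. The flag names follow the conversion script's documentation, but all paths and the bpw/head-bit values below are placeholders, so treat this as an assumption-laden example rather than the exact command used for these quants:

```python
# Rough sketch (not the exact command used here): driving exllamav2's
# convert.py from Python. Flag names follow the conversion script's docs
# at the time of writing; all paths below are placeholders.
import subprocess

subprocess.run(
    [
        "python", "convert.py",
        "-i", "/models/DiscoLM-120b",                 # source fp16 model dir (placeholder)
        "-o", "/tmp/exl2-work",                       # scratch dir for measurement/quant passes
        "-cf", "/models/DiscoLM-120b-3.0bpw-exl2",    # where the finished quant is written
        "-b", "3.0",                                  # target bits per weight
        "-hb", "6",                                   # head bits, the "h6" in the repo names
    ],
    check=True,
)
```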
Thanks for clarifying! I had no idea there was an improved exllamav2 quant method. Looking forward to testing how 2-bit 70B models perform with the improvements.
You can try this older airoboros 1.4.1 model re-quantized with the new method (still one of my favorite 70Bs):
https://huggingface.co/LoneStriker/airoboros-l2-70b-gpt4-1.4.1-2.4bpw-h6-exl2-2
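If you haven't run an exl2 quant from Python before, here's a minimal sketch using the exllamav2 API. Class and method names are as in the repo's example scripts; the model_dir path is a placeholder for wherever you downloaded the weights:

```python
# Minimal sketch of running an EXL2 quant with the exllamav2 Python API.
# Names follow the repo's example scripts; model_dir is a placeholder.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/airoboros-l2-70b-gpt4-1.4.1-2.4bpw-h6-exl2-2"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # lazy alloc so autosplit can size it per GPU
model.load_autosplit(cache)                # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("USER: Hello!\nASSISTANT:", settings, 200))
```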
Perplexity of the old vs. new exl2 quant is improved.
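If you want to reproduce the comparison yourself, here's a rough single-window perplexity sketch against an exl2 quant. It makes the same API assumptions as the example above, the eval text file is a placeholder, and a real evaluation would average over many windows rather than one:

```python
# Rough sketch of a one-window perplexity check on an EXL2 quant.
# Same API assumptions as above; "eval.txt" is a placeholder.
import math
import torch
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer

config = ExLlamaV2Config()
config.model_dir = "/models/airoboros-l2-70b-gpt4-1.4.1-2.4bpw-h6-exl2-2"
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

ids = tokenizer.encode(open("eval.txt").read())[:, :2048]  # one 2048-token window

with torch.inference_mode():
    logits = model.forward(ids)                            # logits for every position
    logprobs = torch.log_softmax(logits[:, :-1].float(), dim=-1)
    targets = ids[:, 1:].unsqueeze(-1).to(logprobs.device)
    nll = -logprobs.gather(-1, targets).mean()             # mean next-token NLL

print(f"ppl: {math.exp(nll.item()):.2f}")                  # lower is better
```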
@LoneStriker That's a massive perplexity improvement. Do you (or @Panchovix) plan to requantize other models that would benefit a lot from the newer method, e.g. Goliath 120B?
Anyway, thank you very much for LoneStriker/DiscoLM-120b-2.65bpw-h6-exl2-2! I'll test this model next.
I have plans to re-do Goliath with the new method, but I can't give an exact date for when the new quants will be up.
@Panchovix Thanks, I can wait; I still have a lot of other models to test, like this one, DiscoLM 120B. Let's see if it can dethrone my current favorite local model, Goliath...
I'm waiting for some new updates from turbo, since it seems a few more commits are going into the experimental branch that would help even more.