EXL3 quantization of Lucy, 6 bits per weight.
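
A minimal sketch for fetching these weights, assuming only that `huggingface_hub` is installed; the downloaded EXL3 tensors are then loaded with exllamav3's own tooling (not shown here, as its loader API is version-dependent).

```python
from huggingface_hub import snapshot_download

# Download the quantized repo to the local HF cache and return its path.
local_dir = snapshot_download("isogen/Lucy-exl3-6bpw")
print(local_dir)  # point exllamav3 at this directory to load the model
```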

HumanEval (argmax)

| Model                        | Q4  | Q6   | Q8   | FP16 |
|------------------------------|-----|------|------|------|
| Lucy-exl3-4bpw               | 0.0 | 64.0 | 63.4 | 64.0 |
| Lucy-exl3-6bpw               | 0.0 | 69.5 | 69.5 | 69.5 |
| Lucy-exl3-8bpw-h8            | 0.0 | 68.9 | 68.9 | 69.5 |
| Qwen3-1.7B-exl3-8bpw-h8      | 0.0 | 70.7 | 68.3 | 68.9 |
| Qwen3-1.7B-Base-exl3-8bpw-h8 | 0.0 | 66.5 | 70.7 | 70.1 |
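
For reference, a minimal sketch of the scoring protocol the table's "argmax" label suggests: one greedy completion per HumanEval problem, scored as pass@1. The `generate_greedy` and `run_unit_tests` callables are hypothetical stand-ins for the actual inference backend and the test-execution harness; only the dataset loader comes from the reference `human-eval` package.

```python
# Sketch of pass@1 scoring with greedy (argmax) decoding, assuming the
# reference human-eval package (pip install human-eval) for the dataset.
from human_eval.data import read_problems

def humaneval_argmax(generate_greedy, run_unit_tests) -> float:
    problems = read_problems()
    passed = 0
    for task_id, problem in problems.items():
        completion = generate_greedy(problem["prompt"])      # always pick the top token
        passed += int(run_unit_tests(problem, completion))   # 1 if all tests pass
    return 100.0 * passed / len(problems)                    # percentage, as reported above
```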

Model tree for isogen/Lucy-exl3-6bpw

Qwen/Qwen3-1.7B → finetuned as Menlo/Lucy → quantized as this model