EXL3 quantization of Jan-nano, 6 bits per weight.
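EXL3 checkpoints are loaded with the exllamav3 runtime (or a frontend built on it, such as TabbyAPI). Below is a minimal sketch for fetching the weights with huggingface_hub; the repo id comes from this card, while the target directory is an arbitrary choice:

```python
# Minimal sketch: fetch the EXL3 weights from the Hub.
# Requires `pip install huggingface_hub`.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="isogen/Jan-nano-exl3-6bpw",
    local_dir="Jan-nano-exl3-6bpw",  # download target; hypothetical path
)
print(f"Model files in: {local_dir}")
```

Point your EXL3-capable loader at the downloaded directory to run the model.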

## HumanEval (argmax)

Scores are HumanEval pass rates (%) with greedy (argmax) decoding. Rows are EXL3 weight quantizations (bpw = bits per weight; h8 = 8-bit output head); the Q4/Q6/Q8/FP16 columns correspond to the KV cache quantization level used at evaluation.

| Model | Q4 | Q6 | Q8 | FP16 |
|---|---:|---:|---:|---:|
| Jan-nano-exl3-4bpw | 79.9 | 81.7 | 82.9 | 82.9 |
| Jan-nano-exl3-6bpw | 83.5 | 81.7 | 81.7 | 81.1 |
| Jan-nano-exl3-8bpw-h8 | 84.8 | 82.9 | 83.5 | 82.9 |
| Qwen3-4B-exl3-4bpw | 80.5 | 81.1 | 81.7 | 80.5 |
| Qwen3-4B-exl3-6bpw | 80.5 | 85.4 | 86.0 | 86.0 |
| Qwen3-4B-exl3-8bpw-h8 | 82.3 | 84.8 | 83.5 | 82.9 |

## Model tree for isogen/Jan-nano-exl3-6bpw

- Base model: Qwen/Qwen3-4B-Base
- Finetuned: Qwen/Qwen3-4B
- Finetuned: Menlo/Jan-nano
- Quantized: isogen/Jan-nano-exl3-6bpw (this model)