---
base_model: janhq/Jan-v1-4B
---

[EXL3](https://github.com/turboderp-org/exllamav3) quantization of [Jan-v1-4B](https://huggingface.co/janhq/Jan-v1-4B), 4 bits per weight.

### HumanEval (argmax)

| Model | Q4 | Q6 | Q8 | FP16 |
| ------------------------------------------------------------------------------ | ---- | ---- | ---- | ---- |
| [Jan-v1-4B-exl3-4bpw](https://huggingface.co/isogen/Jan-v1-4B-exl3-4bpw) | 82.3 | 79.3 | 78.0 | 78.0 |
| [Jan-v1-4B-exl3-6bpw](https://huggingface.co/isogen/Jan-v1-4B-exl3-6bpw) | 78.0 | 76.8 | 77.4 | 76.8 |
| [Jan-v1-4B-exl3-8bpw-h8](https://huggingface.co/isogen/Jan-v1-4B-exl3-8bpw-h8) | 79.9 | 78.7 | 78.0 | 77.4 |
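
One minimal way to fetch the 4 bpw weights for use with an ExLlamaV3-compatible loader is sketched below. It assumes only `huggingface_hub`; the local directory name is arbitrary, and the actual loading step depends on your ExLlamaV3 frontend.

```python
# Sketch: download the EXL3 4bpw weights so an ExLlamaV3-based loader
# can be pointed at the resulting directory.
from huggingface_hub import snapshot_download

model_dir = snapshot_download(
    repo_id="isogen/Jan-v1-4B-exl3-4bpw",
    local_dir="Jan-v1-4B-exl3-4bpw",  # hypothetical local path
)
print(f"EXL3 weights downloaded to {model_dir}")
```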