Metrics
| PPL | arc_easy | arc_challenge | piqa | winogrande | hellaswag | mmlu | QA Avg |
|---|---|---|---|---|---|---|---|
| 4082.93 | 26.26 ± 0.90 | 22.44 ± 1.22 | 52.45 ± 1.17 | 51.78 ± 1.40 | 25.88 ± 0.44 | - | 35.76 |

PPL is perplexity; the task columns report accuracy (%) ± standard error, and QA Avg is the mean of the five reported task accuracies (mmlu is not reported).
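The task columns match EleutherAI lm-evaluation-harness task names, so numbers like these could in principle be re-run with the harness. The sketch below is an assumption, not taken from this card: it presumes the checkpoint loads as a plain Hugging Face causal LM through the `hf` backend.

```python
# Hedged sketch: re-run the QA tasks above with EleutherAI's
# lm-evaluation-harness (pip install lm-eval), assuming the checkpoint
# loads as a standard Hugging Face causal LM.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Heisenger/Llama-2-7b-hf_1bit_int",
    tasks=["arc_easy", "arc_challenge", "piqa", "winogrande", "hellaswag"],
    batch_size=8,
)

# The harness reports accuracy and its standard error as fractions;
# the table above appears to show them as percentages.
for task, metrics in results["results"].items():
    print(task, metrics.get("acc,none"), metrics.get("acc_stderr,none"))
```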
The training method is based on the BitDistiller paper.
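As a rough illustration only: BitDistiller-style training pairs a full-precision teacher with its quantized counterpart as student and optimizes a distillation objective on their output distributions. The sketch below is a generic quantization-aware self-distillation step, not this repository's training code; the function name is a placeholder and the plain KL loss simplifies the paper's confidence-aware KL objective.

```python
# Generic sketch of a quantization-aware self-distillation step in the
# spirit of BitDistiller: the full-precision model is the teacher, the
# quantized model is the student, and the student is trained to match
# the teacher's output distribution. Plain KL is used here; the paper's
# actual objective and quantizer are not reproduced.
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, batch, optimizer, temperature=1.0):
    with torch.no_grad():
        teacher_logits = teacher(**batch).logits
    student_logits = student(**batch).logits

    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    )

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```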
- License: MIT
- Fine-tuned from: TinyLlama/TinyLlama_v1.1
Model: Heisenger/Llama-2-7b-hf_1bit_int (base model: TinyLlama/TinyLlama_v1.1)
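A hedged usage sketch, assuming the repository ships standard Hugging Face weights (the model has no library tag on the Hub, so a custom loader may be required instead):

```python
# Hedged sketch: load and sample from the checkpoint with transformers,
# assuming standard Hugging Face weights are published in the repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Heisenger/Llama-2-7b-hf_1bit_int"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```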