Instruction tune of Yi-34b-200k with Airoboros-3.1 (fp16)
Overview
This is a 4.5bpw quantized version of bhenrym14/airoboros-3_1-yi-34b-200k; see that original repository for full details on the instruction tune.
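
Below is a minimal inference sketch, assuming the weights are an ExLlamaV2 (exl2) quant downloaded to a local directory; 4.5bpw quants are typically in that format, but this is an assumption, and the directory path and prompt are placeholders. Consult the original model card for the expected prompt template.

```python
# Minimal sketch: load an exl2 quant with ExLlamaV2 and generate text.
# The model_dir path is a placeholder; point it at the downloaded quant.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/airoboros-3_1-yi-34b-200k-exl2-4.5bpw"  # placeholder
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate the KV cache as layers load
model.load_autosplit(cache)                # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

# Placeholder prompt; follow the prompt format documented in the base model card.
prompt = "Write a short poem about the ocean."
print(generator.generate_simple(prompt, settings, num_tokens=128))
```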