EXL3 quants: [ H8-4.0BPW ]
Original model: WinterEngine-24B-Instruct by Darkknight535
Downloads last month: 22
Model tree for DeathGodlike/WinterEngine-24B-Instruct_H8-4.0BPW_EXL3
Base model: Darkknight535/WinterEngine-24B-Instruct