Spacewars 24B v1.00b
Released by spacewars123
Quantized by FrenzyBiscuit

AWQ Details
- The model was quantized to INT4 using GEMM kernels.
- Zero-point (asymmetric) quantization
- Group size of 64
Model tree for FrenzyBiscuit/Space-Wars-24B-v1.00b-AWQ
Base model: spacewars123/Space-Wars-24B-v1.00b