DeepSeek-R1-0528-Qwen3-8B MLX w/ AWQ 4-bit
- quantization: 4-bit (AWQ)
- num-samples: 32
- n-grid: 10
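The parameters above match the knobs of mlx-lm's AWQ quantization script. A hypothetical reproduction command, assuming the `mlx_lm.awq` entry point and these flag names, and assuming the quantization was run against the finetuned parent repo listed in the model tree below (not verified from this card):

```shell
# Hypothetical sketch: assumes Apple Silicon, `pip install mlx-lm`,
# and that mlx-lm exposes an `mlx_lm.awq` script with these flags.
pip install mlx-lm
mlx_lm.awq --model unsloth/DeepSeek-R1-0528-Qwen3-8B \
    --bits 4 \
    --num-samples 32 \
    --n-grid 10
```

`--num-samples` controls how many calibration samples the activation-aware search uses, and `--n-grid` the granularity of the scale search; both values here are taken from the card.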
Model tree for cashlion/deepseek-r1-0528-qwen3-8b-mlx-awq
- Base model: deepseek-ai/DeepSeek-R1-0528-Qwen3-8B
- Finetuned from: unsloth/DeepSeek-R1-0528-Qwen3-8B
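A minimal loading sketch for this quantized checkpoint, assuming Apple Silicon and the `mlx-lm` package. The repo id is taken from this card; the `load`/`generate` calls follow mlx-lm's standard API:

```python
# Hypothetical usage sketch: requires Apple Silicon and `pip install mlx-lm`.
from mlx_lm import load, generate

# Download and load the 4-bit AWQ weights from the Hub repo named on this card.
model, tokenizer = load("cashlion/deepseek-r1-0528-qwen3-8b-mlx-awq")

prompt = "Explain AWQ quantization in one sentence."
text = generate(model, tokenizer, prompt=prompt, max_tokens=128)
print(text)
```

Since this is a DeepSeek-R1-style reasoning model, expect chain-of-thought tokens in the output; apply the tokenizer's chat template for conversational use.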