GPTQ Q8 quantized version of DeepSeek-R1-Distill-Llama-8B, released under the Apache 2.0 license. Note: this version is not compatible with ExLlamaV2, but it does work with ExLlama V1 or Transformers. Enjoy!
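A minimal usage sketch for loading this checkpoint through Transformers, as noted above. It assumes a GPTQ-capable backend (such as the auto-gptq or gptqmodel package) is installed alongside Transformers, and the prompt is illustrative only; this is not an official example from the model author. Running it downloads the full quantized weights.

```python
# Sketch: load the GPTQ quantized checkpoint via Transformers.
# Assumes a GPTQ backend (e.g. auto-gptq or gptqmodel) is installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Akashium/DeepSeek-R1-Distill-Llama-8B-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" places the quantized weights on available GPUs.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative prompt; adjust generation settings to taste.
inputs = tokenizer(
    "Explain quantization in one sentence.", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```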
Model tree for Akashium/DeepSeek-R1-Distill-Llama-8B-GPTQ
- Base model: deepseek-ai/DeepSeek-R1-Distill-Llama-8B