Gemma-2-9B-Instruct-4Bit-GPTQ
- Original Model: gemma-2-9b-it
- Model Creator: google
Quantization
- This model was quantized to 4-bit with the AutoGPTQ library
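
Below is a minimal usage sketch for loading the quantized checkpoint with the `transformers` library (which reads GPTQ checkpoints when `optimum` and `auto-gptq` are installed). The repo id `<user>/Gemma-2-9B-Instruct-4Bit-GPTQ` is a placeholder, not the actual path of this model.

```python
# Minimal sketch: load the GPTQ-quantized Gemma-2 instruct model and generate.
# Assumes `transformers`, `optimum`, and `auto-gptq` are installed and that the
# weights are hosted under a repo id like the placeholder below.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<user>/Gemma-2-9B-Instruct-4Bit-GPTQ"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Gemma-2 instruct models expect the chat template to be applied before generation.
messages = [{"role": "user", "content": "Explain GPTQ quantization in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```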