Mungert/gemma-3-12b-it-qat-q4_0-GGUF
Image-Text-to-Text · GGUF · gemma · gemma3 · imatrix · conversational · arXiv: 2312.11805 · License: gemma
Files and versions
Branch: main
1 contributor · History: 19 commits
Latest commit: Update README.md (a7599a4, verified) by Mungert, 8 days ago
| File | Size | Last commit message | Updated |
|------|------|---------------------|---------|
| .gitattributes | 2.66 kB | Upload gemma-3-12b-it-qat-q4_0.imatrix with huggingface_hub | 8 days ago |
| README.md | 19.7 kB | Update README.md | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq1_m.gguf | 3.61 GB | Upload gemma-3-12b-it-qat-q4_0-iq1_m.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq1_s.gguf | 3.45 GB | Upload gemma-3-12b-it-qat-q4_0-iq1_s.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq2_m.gguf | 4.86 GB | Upload gemma-3-12b-it-qat-q4_0-iq2_m.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq2_s.gguf | 4.65 GB | Upload gemma-3-12b-it-qat-q4_0-iq2_s.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq2_xs.gguf | 4.11 GB | Upload gemma-3-12b-it-qat-q4_0-iq2_xs.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq2_xxs.gguf | 3.88 GB | Upload gemma-3-12b-it-qat-q4_0-iq2_xxs.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq3_m.gguf | 5.66 GB | Upload gemma-3-12b-it-qat-q4_0-iq3_m.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq3_s.gguf | 5.46 GB | Upload gemma-3-12b-it-qat-q4_0-iq3_s.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq3_xs.gguf | 5.21 GB | Upload gemma-3-12b-it-qat-q4_0-iq3_xs.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-iq3_xxs.gguf | 4.92 GB | Upload gemma-3-12b-it-qat-q4_0-iq3_xxs.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-q2_k_l.gguf | 5.01 GB | Upload gemma-3-12b-it-qat-q4_0-q2_k_l.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-q2_k_s.gguf | 4.45 GB | Upload gemma-3-12b-it-qat-q4_0-q2_k_s.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-q3_k_l.gguf | 6.25 GB | Upload gemma-3-12b-it-qat-q4_0-q3_k_l.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-q3_k_m.gguf | 6.01 GB | Upload gemma-3-12b-it-qat-q4_0-q3_k_m.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0-q3_k_s.gguf | 5.46 GB | Upload gemma-3-12b-it-qat-q4_0-q3_k_s.gguf with huggingface_hub | 8 days ago |
| gemma-3-12b-it-qat-q4_0.imatrix | 7.43 MB | Upload gemma-3-12b-it-qat-q4_0.imatrix with huggingface_hub | 8 days ago |
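
The commit messages above show that these GGUF files were uploaded with huggingface_hub, and the same library can be used to fetch a single quantization locally. A minimal sketch, assuming the repo id Mungert/gemma-3-12b-it-qat-q4_0-GGUF and the q3_k_m filename from the table above (swap in whichever quantization fits your memory budget):

```python
# Minimal sketch: download one of the listed GGUF quantizations with huggingface_hub.
# Repo id and filename are taken from the file listing above; hf_hub_download returns
# the local cache path, which a llama.cpp-compatible runtime can then load.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="Mungert/gemma-3-12b-it-qat-q4_0-GGUF",
    filename="gemma-3-12b-it-qat-q4_0-q3_k_m.gguf",  # any file from the table above
)
print(model_path)
```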