dahara1 / gemma-3-27b-it-qat-japanese-imatrix
Tags: GGUF, Japanese, imatrix, conversational
Files and versions
Branch: main
1 contributor · History: 13 commits
Latest commit: Update README.md by dahara1 (b60b1e0, verified), about 1 month ago
| File | Safe | Size | LFS | Last commit | Last modified |
|---|---|---|---|---|---|
| .gitattributes | Safe | 2.59 kB |  | Upload 2 files | about 2 months ago |
| README.md |  | 1.14 kB |  | Update README.md | about 1 month ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-IQ2_XS.gguf | Safe | 8.44 GB | LFS | Upload 2 files | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-IQ3_XS.gguf | Safe | 11.6 GB | LFS | Upload 2 files | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q2_K_L.gguf |  | 10.8 GB | LFS | Upload 2 files | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q4_0.gguf | Safe | 15.6 GB | LFS | Upload 2 files | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q4_K-f16.gguf | Safe | 18.2 GB | LFS | Upload 2 files | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q4_K_L.gguf |  | 16.9 GB | LFS | Upload 2 files | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q5_K-f16.gguf | Safe | 20.9 GB | LFS | Upload gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q5_K-f16.gguf | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q5_K_L.gguf |  | 19.6 GB | LFS | Upload gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q5_K_L.gguf | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q5_K_S.gguf | Safe | 18.8 GB | LFS | Upload gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q5_K_S.gguf | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q6_K_L.gguf | Safe | 22.5 GB | LFS | Upload gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q6_K_L.gguf | about 2 months ago |
| gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q8_0.gguf | Safe | 30 GB | LFS | Upload gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q8_0.gguf | about 2 months ago |
| imatrix.dat |  | 13 MB | LFS | Upload 2 files | about 2 months ago |
| mmproj.gguf | Safe | 858 MB | LFS | Upload 2 files | about 2 months ago |
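
For reference, individual files from this listing can be fetched without cloning the full repository. The sketch below is an assumption rather than documented usage for this repo: it uses the standard huggingface_hub client (pip install huggingface_hub) and the Q4_0 filename from the table above; any other listed filename works the same way.

```python
# Minimal sketch (not official usage for this repo): download one quantization
# plus the projector file through the Hugging Face Hub cache.
from huggingface_hub import hf_hub_download

REPO_ID = "dahara1/gemma-3-27b-it-qat-japanese-imatrix"

# 15.6 GB QAT Q4_0 build; substitute any other .gguf name from the table above.
model_path = hf_hub_download(
    repo_id=REPO_ID,
    filename="gemma-3-27b-it-qat-q4_0-japanese-imatrix-Q4_0.gguf",
)

# mmproj.gguf appears to be the multimodal projector; only needed if your
# runtime supports Gemma 3 image input (assumption, not stated in this listing).
mmproj_path = hf_hub_download(repo_id=REPO_ID, filename="mmproj.gguf")

print(model_path)
print(mmproj_path)
```

The returned paths point into the local Hub cache; the .gguf file can then be loaded with a GGUF-capable runtime such as llama.cpp. imatrix.dat is the importance-matrix data used when producing these quantizations and is not needed at inference time.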