DavidAU/Gemma-3-1b-it-MAX-NEO-Imatrix-GGUF

Pipeline: Text Generation
Format: GGUF
Tags: gemma3, instruct, 32k context, all use cases, maxed quants, Neo Imatrix, Inference Endpoints, conversational
License: apache-2.0
Files and versions: Gemma-3-1b-it-MAX-NEO-Imatrix-GGUF (1 contributor, 9 commits)
Latest commit 80450d6 (verified, 10 days ago) by DavidAU: Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf with huggingface_hub
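The commit messages show each quant was pushed with huggingface_hub. A minimal sketch of what such an upload call looks like, assuming write access to the repo and a local copy of the quant file; the author's actual upload script is not published here.

```python
# Minimal sketch of an upload that produces a commit message like the one above.
# Assumes write access to the repo and a local file of the same name;
# this is not the author's actual upload script.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf",
    path_in_repo="Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf",
    repo_id="DavidAU/Gemma-3-1b-it-MAX-NEO-Imatrix-GGUF",
    commit_message="Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf with huggingface_hub",
)
```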
File | Size | Last commit message | Updated
.gitattributes | 1.92 kB | Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf with huggingface_hub | 10 days ago
Gemma-3-1b-it-MAX-NEO-D_AU-IQ1_M-imat.gguf | 927 MB (LFS) | Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ1_M-imat.gguf with huggingface_hub | 10 days ago
Gemma-3-1b-it-MAX-NEO-D_AU-IQ1_S-imat.gguf | 922 MB (LFS) | Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ1_S-imat.gguf with huggingface_hub | 10 days ago
Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf | 947 MB (LFS) | Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf with huggingface_hub | 10 days ago
Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_XS-imat.gguf | 940 MB (LFS) | Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_XS-imat.gguf with huggingface_hub | 10 days ago
Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_XXS-imat.gguf | 934 MB (LFS) | Upload Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_XXS-imat.gguf with huggingface_hub | 10 days ago
README.md | 280 Bytes | Update README.md | 10 days ago
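Any of the GGUF files in the table can be fetched with huggingface_hub and run in a GGUF-compatible runtime. A minimal sketch, assuming llama-cpp-python as the runtime and the IQ2_S quant listed above; the prompt and settings are placeholders, not recommendations from this model card.

```python
# Minimal sketch: download one of the listed quants and run it locally.
# Assumes huggingface_hub and llama-cpp-python are installed; any other
# GGUF-compatible runtime (e.g. llama.cpp directly) works similarly.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="DavidAU/Gemma-3-1b-it-MAX-NEO-Imatrix-GGUF",
    filename="Gemma-3-1b-it-MAX-NEO-D_AU-IQ2_S-imat.gguf",
)

# The card advertises 32k context; n_ctx can be lowered to save memory.
llm = Llama(model_path=gguf_path, n_ctx=32768)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain what a GGUF imatrix quant is."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

The same hf_hub_download call works for every quant in the table; only the filename argument changes.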