gabriellarson/SmallThinker-4BA0.6B-Instruct-GGUF
Tags: Text Generation · GGUF · English · conversational · License: apache-2.0
Branch: main · 50.6 GB · 1 contributor · 8 commits
Latest commit: 32b1729 (verified) by gabriellarson, "Upload folder using huggingface_hub", 2 months ago
File                                            Size     Last commit (all 2 months ago)
.gitattributes                                  3 kB     Upload folder using huggingface_hub
README.md                                       5.37 kB  Create README.md
SmallThinker-32x758M-4BA0.6B-Instruct-F16.gguf  8.55 GB  Upload SmallThinker-32x758M-4BA0.6B-Instruct-F16.gguf with huggingface_hub
SmallThinker-4BA0.6B-Instruct-IQ2_M.gguf        1.52 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-IQ2_S.gguf        1.41 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-IQ2_XS.gguf       1.36 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-IQ2_XXS.gguf      1.24 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-IQ3_XS.gguf       1.85 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-IQ4_XS.gguf       2.35 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q2_K.gguf         1.66 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q2_K_S.gguf       1.57 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q3_K_L.gguf       2.27 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q3_K_M.gguf       2.12 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q4_0.gguf         2.48 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q4_K_M.gguf       2.63 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q4_K_S.gguf       2.49 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q5_0.gguf         2.99 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q5_K_M.gguf       3.06 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q5_K_S.gguf       2.98 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q6_K.gguf         3.51 GB  Upload folder using huggingface_hub
SmallThinker-4BA0.6B-Instruct-Q8_0.gguf         4.55 GB  Upload folder using huggingface_hub
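The listing above offers the same model at many quantization levels; the usual workflow is to pick the largest quant that fits your RAM/VRAM budget and download just that one file. A minimal sketch, assuming the `huggingface_hub` package is installed; the repo id and file sizes are taken from this page, the filename pattern is inferred from the listing, and `pick_quant` is an illustrative helper, not part of any library:

```python
# Quantization sizes (GB, on disk) copied from the repository listing above.
QUANT_SIZES_GB = {
    "IQ2_XXS": 1.24, "IQ2_XS": 1.36, "IQ2_S": 1.41, "IQ2_M": 1.52,
    "Q2_K_S": 1.57, "Q2_K": 1.66, "IQ3_XS": 1.85, "Q3_K_M": 2.12,
    "Q3_K_L": 2.27, "IQ4_XS": 2.35, "Q4_0": 2.48, "Q4_K_S": 2.49,
    "Q4_K_M": 2.63, "Q5_K_S": 2.98, "Q5_0": 2.99, "Q5_K_M": 3.06,
    "Q6_K": 3.51, "Q8_0": 4.55,
}

def pick_quant(budget_gb: float) -> str:
    """Return the largest quantization whose file fits the given budget."""
    fitting = [q for q, s in QUANT_SIZES_GB.items() if s <= budget_gb]
    if not fitting:
        raise ValueError(f"no quantization fits {budget_gb} GB")
    return max(fitting, key=QUANT_SIZES_GB.get)

if __name__ == "__main__":
    # Network call: fetch only the chosen file from the Hub.
    from huggingface_hub import hf_hub_download

    quant = pick_quant(4.0)  # e.g. roughly a 4 GB memory budget
    path = hf_hub_download(
        repo_id="gabriellarson/SmallThinker-4BA0.6B-Instruct-GGUF",
        filename=f"SmallThinker-4BA0.6B-Instruct-{quant}.gguf",
    )
    print(path)
```

The downloaded `.gguf` file can then be loaded by GGUF-aware runtimes such as llama.cpp (for example via its CLI's `-m <path>` model flag).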