masonym/gigabyte-faq-7B-q4_0-GGUF
GGUF
Files and versions (branch: main)
1 contributor
History: 3 commits
Latest commit: ecbc255 by masonym, "Upload gigabyte-faq-7B-q4_0.gguf with huggingface_hub", over 1 year ago
.gitattributes              Safe   1.58 kB          Upload gigabyte-faq-7B-q4_0.gguf with huggingface_hub   over 1 year ago
gigabyte-faq-7B-q4_0.gguf   Safe   3.83 GB   LFS    Upload gigabyte-faq-7B-q4_0.gguf with huggingface_hub   over 1 year ago
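The repository contains a single q4_0 GGUF weights file uploaded with huggingface_hub. A minimal sketch of pulling that file and running it locally is shown below, assuming the llama-cpp-python bindings as the loader; the prompt text and generation settings are illustrative assumptions, not part of this repository or its model card.

```python
# Minimal sketch: download the GGUF file from this repo and run it locally.
# Assumes `huggingface_hub` and `llama-cpp-python` are installed.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Repo id and filename are taken from the file listing above.
model_path = hf_hub_download(
    repo_id="masonym/gigabyte-faq-7B-q4_0-GGUF",
    filename="gigabyte-faq-7B-q4_0.gguf",
)

# Load the q4_0 quantized model through the llama.cpp bindings.
llm = Llama(model_path=model_path, n_ctx=2048)

# Hypothetical prompt; the repository provides no prompt template.
output = llm("Q: How do I update a GIGABYTE motherboard BIOS?\nA:", max_tokens=256)
print(output["choices"][0]["text"])
```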