itsdotscience/mistral-7b-v0.2-gguf
GGUF · 2 likes
1 contributor · History: 2 commits
Latest commit by itsdotscience: bbd0a59 (verified) · "Upload folder using huggingface_hub" · over 1 year ago
| File | Size | LFS |
|---|---|---|
| .gitattributes | 2.46 kB | |
| groups_merged.txt | 201 kB | |
| mistral-7b-v0.2-F16.gguf | 14.7 GB | LFS |
| mistral-7b-v0.2-Q2_K.gguf | 3.14 GB | LFS |
| mistral-7b-v0.2-Q3_K.gguf | 3.94 GB | LFS |
| mistral-7b-v0.2-Q3_K_L.gguf | 4.24 GB | LFS |
| mistral-7b-v0.2-Q3_K_S.gguf | 3.58 GB | LFS |
| mistral-7b-v0.2-Q4_0.gguf | 4.54 GB | LFS |
| mistral-7b-v0.2-Q4_1.gguf | 4.97 GB | LFS |
| mistral-7b-v0.2-Q4_K.gguf | 4.79 GB | LFS |
| mistral-7b-v0.2-Q4_K_S.gguf | 4.56 GB | LFS |
| mistral-7b-v0.2-Q5_0.gguf | 5.43 GB | LFS |
| mistral-7b-v0.2-Q5_1.gguf | 5.86 GB | LFS |
| mistral-7b-v0.2-Q5_K.gguf | 5.55 GB | LFS |
| mistral-7b-v0.2-Q5_K_S.gguf | 5.41 GB | LFS |
| mistral-7b-v0.2-Q6_K.gguf | 6.36 GB | LFS |
| mistral-7b-v0.2-Q8_0.gguf | 8.08 GB | LFS |

All files are flagged "Safe" by the Hub's scanner, and every file was added in the same commit ("Upload folder using huggingface_hub") over 1 year ago.
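The size differences between the quantizations above can be made concrete by converting each file size into an approximate bits-per-weight figure. A minimal sketch, assuming roughly 7.24e9 parameters for Mistral 7B (an estimate; the parameter count is not stated in this listing) and the Hub's convention of 1 GB = 1e9 bytes:

```python
# Approximate bits-per-weight for each GGUF file, derived from the sizes
# in the table above. PARAMS is an assumed parameter count for Mistral 7B,
# not a value taken from this repository.
PARAMS = 7.24e9

sizes_gb = {
    "F16": 14.7, "Q2_K": 3.14, "Q3_K": 3.94, "Q3_K_L": 4.24,
    "Q3_K_S": 3.58, "Q4_0": 4.54, "Q4_1": 4.97, "Q4_K": 4.79,
    "Q4_K_S": 4.56, "Q5_0": 5.43, "Q5_1": 5.86, "Q5_K": 5.55,
    "Q5_K_S": 5.41, "Q6_K": 6.36, "Q8_0": 8.08,
}

def bits_per_weight(size_gb: float, params: float = PARAMS) -> float:
    """File size in GB (1 GB = 1e9 bytes) -> approximate bits per weight."""
    return size_gb * 1e9 * 8 / params

for name, gb in sorted(sizes_gb.items(), key=lambda kv: kv[1]):
    print(f"{name:7s} {gb:5.2f} GB  ~{bits_per_weight(gb):.1f} bits/weight")
```

The estimates run slightly above the nominal bit widths in the quantization names (e.g. Q4_0 lands near 5 bits/weight rather than 4) because GGUF blocks carry scale metadata and some tensors, such as the embeddings, are stored at higher precision.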