bartowski/Sao10K_Llama-3.3-70B-Vulpecula-r1-GGUF
Text Generation · GGUF · English · imatrix · conversational
License: llama3.3
Revision 5f9f389 (verified) · 1 contributor · History: 15 commits
Latest commit by bartowski: "Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ3_M.gguf with huggingface_hub", 8 days ago
| Folder | Last commit message | Updated |
|---|---|---|
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q5_K_M | Upload folder using huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q6_K | Upload folder using huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q8_0 | Upload folder using huggingface_hub | 8 days ago |
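The three entries above were uploaded as folders rather than single files, presumably because those larger quantizations are split across multiple GGUF shards (an assumption; the listing only shows "Upload folder using huggingface_hub"). A minimal sketch for pulling one such folder with huggingface_hub, using the repository id from this page; the Q8_0 folder is chosen only as an example and "models" is a hypothetical local target directory:

```python
from huggingface_hub import snapshot_download

# Download only the Q8_0 folder from this repository.
# allow_patterns restricts the snapshot to files under that folder.
snapshot_download(
    repo_id="bartowski/Sao10K_Llama-3.3-70B-Vulpecula-r1-GGUF",
    allow_patterns=["Sao10K_Llama-3.3-70B-Vulpecula-r1-Q8_0/*"],
    local_dir="models",  # hypothetical target directory
)
```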
| File | Size | LFS | Last commit message | Updated |
|---|---|---|---|---|
| .gitattributes | 3.23 kB | | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ3_M.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ3_M.gguf | 31.9 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ3_M.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ4_NL.gguf | 40.1 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ4_NL.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ4_XS.gguf | 37.9 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ4_XS.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q3_K_L.gguf | 37.1 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q3_K_L.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q3_K_M.gguf | 34.3 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q3_K_M.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q3_K_XL.gguf | 38.1 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q3_K_XL.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_0.gguf | 40.1 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_0.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_1.gguf | 44.3 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_1.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_K_M.gguf | 42.5 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_K_M.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_K_S.gguf | 40.3 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q4_K_S.gguf with huggingface_hub | 8 days ago |
| Sao10K_Llama-3.3-70B-Vulpecula-r1-Q5_K_S.gguf | 48.7 GB | LFS | Upload Sao10K_Llama-3.3-70B-Vulpecula-r1-Q5_K_S.gguf with huggingface_hub | 8 days ago |
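For the single-file quantizations listed above, a minimal download sketch with huggingface_hub (the library named in the commit messages); the IQ4_XS file is picked only as an example and the local directory is a hypothetical choice:

```python
from huggingface_hub import hf_hub_download

# Fetch one quantization file from this repository; returns the local file path.
model_path = hf_hub_download(
    repo_id="bartowski/Sao10K_Llama-3.3-70B-Vulpecula-r1-GGUF",
    filename="Sao10K_Llama-3.3-70B-Vulpecula-r1-IQ4_XS.gguf",
    local_dir="models",  # hypothetical target directory
)
print(model_path)
```

The resulting .gguf file can then be loaded by GGUF-compatible runtimes such as llama.cpp.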