Darkhn/L3.3-70B-Animus-V9.0A-9.1A-9.2A-GGUF
Branch: main · Folder: V9.2A
2 contributors · History: 15 commits
Latest commit: Darkhn · Upload L3.3-70B-Animus-V9.2A-Q3_K_L.gguf (#4) · 7b4c7e7 · verified · 8 days ago
File · Size · Last commit · Last updated
L3.3-70B-Animus-V9.2A-Q2_K.gguf · 26.4 GB · Upload L3.3-70B-Animus-V9.2A-Q2_K.gguf (#3) · 8 days ago
L3.3-70B-Animus-V9.2A-Q3_K_L.gguf · 37.1 GB · Upload L3.3-70B-Animus-V9.2A-Q3_K_L.gguf (#4) · 8 days ago
L3.3-70B-Animus-V9.2A-Q3_K_M.gguf · 34.3 GB · Rename L3.3-70B-Animus-V9.2A-failed-Q3_K_M.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q3_K_M.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q3_K_S.gguf · 30.9 GB · Rename L3.3-70B-Animus-V9.2A-failed-Q3_K_S.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q3_K_S.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q4_0.gguf · 40 GB · Rename L3.3-70B-Animus-V9.2A-failed-Q4_0.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q4_0.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q4_K_M.gguf · 42.5 GB · Rename L3.3-70B-Animus-V9.2A-failed-Q4_K_M.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q4_K_M.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q4_K_S.gguf · 40.3 GB · Rename L3.3-70B-Animus-V9.2A-failed-Q4_K_S.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q4_K_S.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q5_0.gguf · 48.7 GB · Rename L3.3-70B-Animus-V9.2A-failed-Q5_0.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q5_0.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q5_K_M.gguf · 49.9 GB · Rename L3.3-70B-Animus-V9.2A-failed-Q5_K_M.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q5_K_M.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q5_K_S.gguf · 48.7 GB · Rename L3.3-70B-Animus-V9.2-Q5_K_S.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q5_K_S.gguf · 9 days ago
L3.3-70B-Animus-V9.2A-Q6_K.gguf-00001-of-00002.gguf · 44.9 GB · Upload L3.3-70B-Animus-V9.2A-Q6_K.gguf-00001-of-00002.gguf (#1) · 9 days ago
L3.3-70B-Animus-V9.2A-Q6_K.gguf-00002-of-00002.gguf · 13 GB · Rename L3.3-70B-Animus-V9.2-Q6_K.gguf-00002-of-00002.gguf to V9.2A/L3.3-70B-Animus-V9.2A-Q6_K.gguf-00002-of-00002.gguf · 9 days ago
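
A minimal sketch of how one of these quantizations could be pulled locally with the huggingface_hub client; the Q4_K_M file under V9.2A is used purely as an example, and any filename from the listing above would work the same way.

```python
# Sketch: download a single GGUF quantization from this repository.
# Assumes `pip install huggingface_hub` and enough disk space for the ~42.5 GB file.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="Darkhn/L3.3-70B-Animus-V9.0A-9.1A-9.2A-GGUF",
    filename="V9.2A/L3.3-70B-Animus-V9.2A-Q4_K_M.gguf",
)
print(local_path)  # local cache path of the downloaded .gguf file
```

The call returns the path to the cached .gguf file; for the split Q6_K upload, both part files would need to be fetched before use.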