- DavidAU/Mistral-MOE-4X7B-Dark-MultiVerse-Uncensored-Enhanced32-24B-gguf — Text Generation, 24B parameters
- Language Learning - MoE - 12 GB GPU + 32 GB RAM (collection, 2 items) — forces model expert weights onto the CPU