DavidAU/How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts
Tags: Text Generation, English, MOE, Mixture of Experts, Mixtral, 4X8, 2X8, deepseek, reasoning, reason, thinking, all use cases, bfloat16, float32, float16, role play, sillytavern, backyard, lmstudio, Text Generation WebUI, llama 3, mistral, llama 3.1, qwen 2.5, context 128k, mergekit, Merge
License: apache-2.0
1 contributor · History: 6 commits · latest: "Update README.md" by DavidAU (commit 2293fd6, verified, 21 days ago)

Files:
- .gitattributes (1.52 kB) — initial commit, 3 months ago
- README.md (4.16 kB) — Update README.md, 21 days ago