WesPro/State-of-the-MoE_RP-2x7B
Tags: Text Generation · Transformers · Safetensors · mixtral · text-generation-inference · Inference Endpoints
No model card has been provided for this model.
Downloads last month: 5
Safetensors · Model size: 12.9B params · Tensor type: BF16
Inference Providers (Text Generation): This model is not currently available via any of the supported Inference Providers.
Model tree for WesPro/State-of-the-MoE_RP-2x7B: Quantizations (3 models)