A 3x3B Mixture of Experts (MoE) experimental model with 2 experts activated per token. Special thanks to MoV.
PS: A decent model with fast generation. Maybe it's silly, but it's fun enough if you use it for message-style chat roleplay.
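
Since this is a top-2 MoE, each token is routed to 2 of the 3 experts and their outputs are mixed by renormalized gate weights. Below is a minimal sketch of that routing step in PyTorch; the function name, tensor shapes, and the 3-expert router are illustrative assumptions, not this model's actual code.

```python
import torch
import torch.nn.functional as F

def top2_route(hidden: torch.Tensor, router_weight: torch.Tensor):
    """Pick 2 of num_experts experts per token and renormalize their gates.

    hidden:        (num_tokens, hidden_dim)
    router_weight: (hidden_dim, num_experts)  # num_experts = 3 for this model
    """
    logits = hidden @ router_weight            # (num_tokens, num_experts)
    top2 = logits.topk(2, dim=-1)              # 2 experts activated per token
    gates = F.softmax(top2.values, dim=-1)     # mixing weights over the chosen 2
    return top2.indices, gates
```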
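
For chat roleplay, a standard `transformers` loading snippet should work. This is a sketch that assumes the repo ships a causal-LM config and a chat template (both assumptions; adjust if the repo files say otherwise).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DoppelReflEx/MoETest-3E2A-3x3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Hypothetical roleplay prompt, formatted with the tokenizer's chat template.
messages = [{"role": "user", "content": "You are a sarcastic tavern keeper. Greet me."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```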