TinyMix-8x1b

This model is a Mixture-of-Experts (MoE) consisting of 8 experts, each initialized from TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T.

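The card does not say how the experts were assembled. Below is a minimal sketch (not necessarily the author's method) of one way to do it with the `transformers` Mixtral implementation: every expert MLP is seeded from the dense TinyLlama weights, and the router is left at its random initialization. The output path is illustrative.

```python
# A sketch, not the author's actual script: build a Mixtral-style MoE whose
# 8 experts are all copies of the dense TinyLlama MLP. The router
# (`block_sparse_moe.gate`) keeps its random init, which is why the
# resulting model counts as "untrained".
import torch
from transformers import AutoModelForCausalLM, MixtralConfig, MixtralForCausalLM

BASE = "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T"
NUM_EXPERTS = 8

base = AutoModelForCausalLM.from_pretrained(BASE, torch_dtype=torch.bfloat16)
cfg = base.config

# Mirror TinyLlama's dimensions in a Mixtral config with 8 local experts.
moe_cfg = MixtralConfig(
    vocab_size=cfg.vocab_size,
    hidden_size=cfg.hidden_size,
    intermediate_size=cfg.intermediate_size,
    num_hidden_layers=cfg.num_hidden_layers,
    num_attention_heads=cfg.num_attention_heads,
    num_key_value_heads=cfg.num_key_value_heads,
    max_position_embeddings=cfg.max_position_embeddings,
    rms_norm_eps=cfg.rms_norm_eps,
    rope_theta=cfg.rope_theta,
    num_local_experts=NUM_EXPERTS,
    num_experts_per_tok=2,
)
moe = MixtralForCausalLM(moe_cfg).to(torch.bfloat16)

# Llama MLP projections map onto Mixtral expert weights as follows.
mlp_map = {"gate_proj": "w1", "up_proj": "w3", "down_proj": "w2"}

src, dst = base.state_dict(), moe.state_dict()
with torch.no_grad():
    for name, tensor in src.items():
        if ".mlp." in name:
            # Duplicate the dense MLP into every expert.
            for llama_name, mixtral_name in mlp_map.items():
                if llama_name in name:
                    for e in range(NUM_EXPERTS):
                        target = name.replace(
                            f"mlp.{llama_name}",
                            f"block_sparse_moe.experts.{e}.{mixtral_name}",
                        )
                        dst[target].copy_(tensor)
        elif name in dst:
            # Attention, embeddings, norms and LM head carry over unchanged.
            dst[name].copy_(tensor)

moe.save_pretrained("tinymix-8x1b", safe_serialization=True)  # illustrative path
```

Duplicating only the per-layer MLPs across 8 experts takes the parameter count from 1.1B to roughly the 6.43B reported below: the attention, embedding, and norm weights stay shared, while each MLP appears 8 times.
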
This checkpoint is untrained and will likely perform worse than the dense base model.

Training will start very soon.

The idea comes from eastwind, who applied it to the chat version of the model.

Model size: 6.43B parameters · Tensor type: BF16 · Format: Safetensors
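The model can be run locally with `transformers`. A minimal sketch; the repo id below is a placeholder, substitute the actual Hub path (or a local directory):

```python
# Minimal local-loading sketch. "USER/TinyMix-8x1b" is a hypothetical repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "USER/TinyMix-8x1b"  # placeholder, not the real Hub path
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",           # requires `accelerate`
)

inputs = tok("The quick brown fox", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```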