This model was created by reducing oshizo/japanese-e5-mistral-7b_slerp to 8 layers and then training it on 800,000 Japanese sentences.
See this article for details (in Japanese):
https://note.com/oshizo/n/n9140df790315

See the intfloat/e5-mistral-7b-instruct model page for usage instructions.
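As a rough sketch of that usage pattern: e5-mistral-style models embed an instruction-prefixed query and pool the last non-padding token of the final hidden states. The snippet below illustrates the last-token pooling step on dummy tensors; the commented-out model-loading lines are an assumption based on the standard `transformers` API, not tested here.

```python
import torch
import torch.nn.functional as F

# Assumed loading pattern (standard transformers API, not verified here):
# from transformers import AutoTokenizer, AutoModel
# tokenizer = AutoTokenizer.from_pretrained("oshizo/japanese-e5-mistral-1.9b")
# model = AutoModel.from_pretrained("oshizo/japanese-e5-mistral-1.9b")

def last_token_pool(hidden: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Pick the final non-padding token's hidden state for each sequence."""
    if mask[:, -1].all():
        # Left padding: the last position is always a real token.
        return hidden[:, -1]
    # Right padding: index the last real token per sequence.
    idx = mask.sum(dim=1) - 1
    return hidden[torch.arange(hidden.size(0)), idx]

# Demonstration on dummy hidden states (batch=2, seq_len=4, dim=3).
hidden = torch.arange(24, dtype=torch.float32).reshape(2, 4, 3)
mask = torch.tensor([[1, 1, 1, 0],   # 3 real tokens -> last index 2
                     [1, 1, 1, 1]])  # 4 real tokens -> last index 3
emb = F.normalize(last_token_pool(hidden, mask), dim=-1)
print(emb.shape)  # torch.Size([2, 3])
```

With the real model, `hidden` would come from `model(**inputs).last_hidden_state` and `mask` from the tokenizer's `attention_mask`; queries would additionally carry a task instruction prefix as described on the e5-mistral page.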

Model size: 1.88B params (F32 tensors)