---
base_model: mistralai/Mixtral-8x22B-v0.1
tags:
- Mixtral
- instruct
- finetune
- chatml
- gpt4
- synthetic data
- distillation
language:
- en
license: apache-2.0
datasets:
- teknium/OpenHermes-2.5
---
# OpenHermes 2.5 - Mixtral 8x22B
Mixtral 8x22B fully supervised fine-tuned (SFT) on the OpenHermes 2.5 dataset (https://huggingface.co/datasets/teknium/OpenHermes-2.5).
Evaluations are still being run. Checkpoints are available on the `4th-epoch` and `3rd-epoch` branches.
The prompt format is ChatML. Refer to https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B for examples.
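As a quick illustration, a ChatML prompt can be assembled with a small helper like the sketch below. The `<|im_start|>`/`<|im_end|>` tokens and role names follow the standard ChatML convention; the function name and the example system/user strings are placeholders, not part of this model card.

```python
def build_chatml_prompt(messages, add_generation_prompt=True):
    """Render a list of {'role', 'content'} dicts into a ChatML string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Leave the assistant turn open so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are Hermes, a helpful assistant."},
    {"role": "user", "content": "What is SFT?"},
])
print(prompt)
```

With a `transformers` tokenizer that ships a chat template, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` produces the same layout.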
Research supported by Google's TPU Research Cloud.