---
license: cc-by-nc-4.0
tags:
- moe
---
# Mixtral MOE 2x7B
A Mixture-of-Experts (MoE) model built from the following models with mergekit, then fine-tuned with DPO.
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
* [jondurbin/bagel-dpo-7b-v0.1](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.1)
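A merge like the one above is typically driven by a `mergekit-moe` YAML config. The exact config used here is not published; the sketch below is a minimal illustration of the format, assuming Mistral-7B-Instruct-v0.2 as the base model and the other two models as experts. The `gate_mode` choice and the `positive_prompts` strings are placeholders, not the actual values used for this model.

```yaml
# Hypothetical mergekit-moe config sketch -- model roles, gate_mode,
# and positive_prompts are assumptions, not the published recipe.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden          # route by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts:
      - "chat"             # placeholder routing prompt
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "instruction"      # placeholder routing prompt
```

With a config like this, `mergekit-moe config.yaml ./output-model` would produce the merged 2-expert checkpoint, which could then be fine-tuned with DPO (e.g. via TRL's `DPOTrainer`).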