Text Generation
Transformers
Safetensors
mixtral
Mixture of Experts
mergekit
Merge
chinese
arabic
english
multilingual
german
french
gagan3012/MetaModel
jeonsworld/CarbonVillain-en-10.7B-v2
jeonsworld/CarbonVillain-en-10.7B-v4
TomGrc/FusionNet_linear
DopeorNope/SOLARC-M-10.7B
VAGOsolutions/SauerkrautLM-SOLAR-Instruct
upstage/SOLAR-10.7B-Instruct-v1.0
fblgit/UNA-SOLAR-10.7B-Instruct-v1.0
conversational
text-generation-inference
Upload README.md with huggingface_hub
README.md CHANGED
@@ -10,82 +10,57 @@ tags:
 - multilingual
 - german
 - french
-(eight removed tag lines; their content is not preserved in this diff view)
+- gagan3012/MetaModel
+- jeonsworld/CarbonVillain-en-10.7B-v2
+- jeonsworld/CarbonVillain-en-10.7B-v4
+- TomGrc/FusionNet_linear
+- DopeorNope/SOLARC-M-10.7B
+- VAGOsolutions/SauerkrautLM-SOLAR-Instruct
+- upstage/SOLAR-10.7B-Instruct-v1.0
+- fblgit/UNA-SOLAR-10.7B-Instruct-v1.0
 ---
 
 # MetaModel_moex8
 
 This model is a Mixture of Experts (MoE) made with [mergekit](https://github.com/cg123/mergekit) (mixtral branch). It uses the following base models:
-(eight removed bullet lines; the old model links are truncated to "* [" in this diff view)
+* [gagan3012/MetaModel](https://huggingface.co/gagan3012/MetaModel)
+* [jeonsworld/CarbonVillain-en-10.7B-v2](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v2)
+* [jeonsworld/CarbonVillain-en-10.7B-v4](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v4)
+* [TomGrc/FusionNet_linear](https://huggingface.co/TomGrc/FusionNet_linear)
+* [DopeorNope/SOLARC-M-10.7B](https://huggingface.co/DopeorNope/SOLARC-M-10.7B)
+* [VAGOsolutions/SauerkrautLM-SOLAR-Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)
+* [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0)
+* [fblgit/UNA-SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/fblgit/UNA-SOLAR-10.7B-Instruct-v1.0)
 
 ## 🧩 Configuration
 
-```yaml
-base_model:  # (value not preserved in this diff view)
-dtype: bfloat16
-experts:
-  - positive_prompts:  # (some prompt strings were not preserved in this diff view)
-      - tell me
-      - explain
-    source_model: openchat/openchat-3.5-1210
-  - positive_prompts:
-      - javascript
-      - programming
-      - algorithm
-    source_model: beowolx/CodeNinja-1.0-OpenChat-7B
-  - positive_prompts:
-      - scene
-      - story
-      - character
-    source_model: maywell/PiVoT-0.1-Starling-LM-RP
-  - positive_prompts:
-      - mathematics
-      - solve
-      - count
-    source_model: WizardLM/WizardMath-7B-V1.1
-  - positive_prompts:
-      - korea
-    source_model: davidkim205/komt-mistral-7b-v1
-  - positive_prompts:
-      - answer in chinese
-    source_model: OpenBuddy/openbuddy-zephyr-7b-v14.1
-  - positive_prompts:
-      - hindu
-      - answer in hindi
-    source_model: manishiitg/open-aditi-hi-v1
-  - positive_prompts:
-      - answer in german
-      - deutsch
-    source_model: VAGOsolutions/SauerkrautLM-7b-v1-mistral
-gate_mode: hidden
-```
+```yaml
+base_model: jeonsworld/CarbonVillain-en-10.7B-v4
+dtype: bfloat16
+experts:
+  - positive_prompts:
+      - ''
+    source_model: gagan3012/MetaModel
+  - positive_prompts:
+      - ''
+    source_model: jeonsworld/CarbonVillain-en-10.7B-v2
+  - positive_prompts:
+      - ''
+    source_model: jeonsworld/CarbonVillain-en-10.7B-v4
+  - positive_prompts:
+      - ''
+    source_model: TomGrc/FusionNet_linear
+  - positive_prompts:
+      - ''
+    source_model: DopeorNope/SOLARC-M-10.7B
+  - positive_prompts:
+      - ''
+    source_model: VAGOsolutions/SauerkrautLM-SOLAR-Instruct
+  - positive_prompts:
+      - ''
+    source_model: upstage/SOLAR-10.7B-Instruct-v1.0
+  - positive_prompts:
+      - ''
+    source_model: fblgit/UNA-SOLAR-10.7B-Instruct-v1.0
+gate_mode: hidden
+```
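For reference, the updated expert configuration can also be built programmatically before handing it to a merge tool. The sketch below is not part of the commit: the model names and fields are copied from the new YAML above, and the final assertions are only local sanity checks (nothing is downloaded or merged).

```python
# Rebuild the updated mergekit MoE configuration from the README as a dict.
# All values are transcribed from the YAML in the diff above.

# The eight SOLAR-family experts listed in the new README.
EXPERTS = [
    "gagan3012/MetaModel",
    "jeonsworld/CarbonVillain-en-10.7B-v2",
    "jeonsworld/CarbonVillain-en-10.7B-v4",
    "TomGrc/FusionNet_linear",
    "DopeorNope/SOLARC-M-10.7B",
    "VAGOsolutions/SauerkrautLM-SOLAR-Instruct",
    "upstage/SOLAR-10.7B-Instruct-v1.0",
    "fblgit/UNA-SOLAR-10.7B-Instruct-v1.0",
]

config = {
    "base_model": "jeonsworld/CarbonVillain-en-10.7B-v4",
    "dtype": "bfloat16",
    "gate_mode": "hidden",  # route using hidden-state similarity to the prompts
    "experts": [
        {"positive_prompts": [""], "source_model": name} for name in EXPERTS
    ],
}

# Sanity checks on the transcription.
assert len(config["experts"]) == 8
assert all(e["positive_prompts"] == [""] for e in config["experts"])
```

One observation on the change itself: with every `positive_prompts` set to the empty string, hidden-state gating appears to have no per-expert signal to specialise on, whereas the previous revision's per-domain prompts (code, math, role-play, languages) gave `gate_mode: hidden` something to discriminate between experts.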