Kquant03 committed bfef005 (parent: 514b203)

Update README.md

Files changed (1): README.md (+1, -1)

README.md CHANGED
@@ -6,7 +6,7 @@ A frankenMoE of [TinyLlama-1.1B-1T-OpenOrca](https://huggingface.co/jeff31415/Ti
 [TinyLlama-1.1B-intermediate-step-1195k-token-2.5T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T),
 and [tiny-llama-1.1b-chat-medical](https://huggingface.co/SumayyaAli/tiny-llama-1.1b-chat-medical).

-### Most 1.1B models are incoherent and can't even answer simple questions. I picked out some models that aren't as bad, then scripted for them to be split into different fields properly based on what their training data made them excel at. [The mergekit-moe script can be found here](https://drive.google.com/file/d/1JMdqQ1eTcOUAT6uuWVGgpEKenF2m3gwi/view?usp=drive_link). It is based on [Undi95's config.yaml for his MoE RP here](https://huggingface.co/Undi95/Mixtral-8x7B-MoE-RP-Story/blob/main/config.yaml)
+### Most 1.1B models are incoherent and can't even answer simple questions. I picked out some models that aren't as bad, then scripted for them to be split into different fields properly based on what their training data made them excel at. [The mergekit-moe script can be found here](https://drive.google.com/file/d/1JMdqQ1eTcOUAT6uuWVGgpEKenF2m3gwi/view?usp=drive_link). It is based on [Undi95's config.yaml for his Mixtral-8x7B-MoE-RP-Story here](https://huggingface.co/Undi95/Mixtral-8x7B-MoE-RP-Story/blob/main/config.yaml)

 OpenOrca experts have been given the task of creating responses for simple questions about things like pop culture, history, and science...step-1195k experts have been chosen to provide warmth and a positive environment, while chat-medical experts have been chosen to provide further detail about human subjects, and to give small little bits of medical advice: I.E. "how do I get rid of this headache I gave myself from making you?"
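The expert split described in this README follows the general shape of a mergekit-moe config. The sketch below is illustrative only, not the author's actual script (that is linked on Google Drive above): the `positive_prompts` lists are assumptions inferred from the task descriptions in the README, and the choice of `base_model` is hypothetical.

```yaml
# Hypothetical mergekit-moe config illustrating the routing split described above.
# Not the author's actual script; prompt lists are assumptions based on the README text.
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T  # assumed base
gate_mode: hidden      # route tokens by hidden-state similarity to the positive prompts
dtype: bfloat16
experts:
  - source_model: jeff31415/TinyLlama-1.1B-1T-OpenOrca
    positive_prompts:
      - "answer a simple question about pop culture"
      - "answer a simple question about history"
      - "answer a simple question about science"
  - source_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T
    positive_prompts:
      - "respond with warmth and a positive tone"
  - source_model: SumayyaAli/tiny-llama-1.1b-chat-medical
    positive_prompts:
      - "give a small bit of medical advice"
      - "provide detail about a human subject"
```

With a config like this, `mergekit-moe config.yaml ./output-model` would assemble the frankenMoE, using each expert's prompt list to initialize the router gates.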