Stellar Odyssey 12b v0.0
We will see... Come with me, take the journey~
Listen to the song on YouTube.
Soo... after I failed the first time, I took another crack at merging. This time, the following models were used:
- mistralai/Mistral-Nemo-Base-2407
- Sao10K/MN-12B-Lyra-v4
- nothingiisreal/MN-12B-Starcannon-v2
- Gryphe/Pantheon-RP-1.5-12b-Nemo
This model is licensed under cc-by-nc-4.0.
I hope this was worth the time I spent to create this merge, lol
Access is gated for now; gating will be lifted once testing is done. Thanks to everyone who has shown interest.
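
Once access is granted, the merge should load like any other Mistral-Nemo-based model through transformers. Here's a minimal sketch; the repo id below is a placeholder (this card doesn't state the final upload path), so swap in the real one:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; replace with this model's actual Hugging Face path.
repo_id = "your-namespace/Stellar-Odyssey-12b-v0.0"

# A gated repo needs an access token: run `huggingface-cli login` first,
# or pass token="hf_..." to both from_pretrained calls.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

prompt = "Come with me, take the journey"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```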
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the della_linear merge method, with C:\Users\lg911\Downloads\Mergekit-Fixed\mergekit\mistralai_Mistral-Nemo-Base-2407 (a local copy of mistralai/Mistral-Nemo-Base-2407) as the base.
Models Merged
The following models were included in the merge:
- C:\Users\Downloads\Mergekit-Fixed\mergekit\Sao10K_MN-12B-Lyra-v4
- C:\Users\Downloads\Mergekit-Fixed\mergekit\Gryphe_Pantheon-RP-1.5-12b-Nemo
- C:\Users\Downloads\Mergekit-Fixed\mergekit\nothingiisreal_MN-12B-Starcannon-v2
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\Sao10K_MN-12B-Lyra-v4
    parameters:
      weight: 0.3
      density: 0.25
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\nothingiisreal_MN-12B-Starcannon-v2
    parameters:
      weight: 0.1
      density: 0.4
  - model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\Gryphe_Pantheon-RP-1.5-12b-Nemo
    parameters:
      weight: 0.4
      density: 0.5
merge_method: della_linear
base_model: C:\Users\\Downloads\Mergekit-Fixed\mergekit\mistralai_Mistral-Nemo-Base-2407
parameters:
  epsilon: 0.05
  lambda: 1
dtype: bfloat16
```
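
If you want to reproduce or tweak this merge, the config above can be fed straight to mergekit. Below is a minimal sketch based on the usage shown in mergekit's README, assuming mergekit is installed (`pip install mergekit`) and the YAML is saved as `config.yaml` with the model paths adjusted to your machine; the one-line CLI equivalent is `mergekit-yaml config.yaml ./output-dir`:

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge recipe (the YAML shown above, with paths valid locally).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./stellar-odyssey-12b",  # where the merged weights are written
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # carry the base tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```

Roughly speaking, for della_linear `weight` sets each model's contribution, `density` the fraction of each model's delta parameters kept, and `epsilon`/`lambda` the spread of the drop probabilities and the rescaling of the merged deltas.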