# merge

This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with Sorawiz/MistralCreative-24B-Chat as the base model.
### Models Merged

The following models were included in the merge:
- Gryphe/Pantheon-RP-1.8-24b-Small-3.1
- ReadyArt/Forgotten-Abomination-24B-v4.0
- ReadyArt/Forgotten-Transgression-24B-v4.1
### Configuration

The following YAML configuration was used to produce this model:
```yaml
merge_method: dare_ties
base_model: Sorawiz/MistralCreative-24B-Chat
models:
  - model: Sorawiz/MistralCreative-24B-Chat
    parameters:
      weight: 0.20
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      weight: 0.20
  - model: ReadyArt/Forgotten-Transgression-24B-v4.1
    parameters:
      weight: 0.30
  - model: ReadyArt/Forgotten-Abomination-24B-v4.0
    parameters:
      weight: 0.30
parameters:
  density: 1
tokenizer:
  source: union
chat_template: auto
```
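In rough terms, DARE drops a random fraction `1 - density` of each model's parameter deltas (the differences from the base model) and rescales the survivors, while TIES resolves sign conflicts between models before the weighted deltas are added back to the base; with `density: 1` as in the config above, no deltas are dropped. The sketch below illustrates the idea on plain Python lists. It is a simplified toy illustration only, not mergekit's actual implementation, and the `dare` / `ties_merge` helper names are hypothetical.

```python
import random

def dare(delta, density, rng):
    """DARE step (toy): drop each delta value with probability (1 - density),
    rescaling the survivors by 1/density so the expected update is unchanged."""
    if density >= 1.0:
        return list(delta)  # density: 1 (as in the config above) keeps every delta
    return [d / density if rng.random() < density else 0.0 for d in delta]

def ties_merge(base, deltas, weights):
    """TIES-style sign election (toy): per parameter, sum the weighted deltas,
    keep only the contributions that agree with the majority sign, and add
    them to the base weight."""
    merged = []
    for i, b in enumerate(base):
        contribs = [w * d[i] for d, w in zip(deltas, weights)]
        total = sum(contribs)
        sign = 1.0 if total >= 0 else -1.0
        kept = [c for c in contribs if c * sign > 0]
        merged.append(b + sum(kept))
    return merged

# Toy usage: two "fine-tuned models" expressed as deltas from a shared base.
base = [0.0, 0.0, 0.0]
deltas = [
    dare([0.2, -0.4, 0.1], density=1.0, rng=random.Random(0)),
    dare([0.6, 0.2, -0.3], density=1.0, rng=random.Random(1)),
]
print(ties_merge(base, deltas, weights=[0.5, 0.5]))  # roughly [0.4, -0.2, -0.15]
```

In the second parameter the two models disagree in sign, so the minority contribution is discarded rather than averaged in; this is the interference reduction that distinguishes TIES from a plain weighted average.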