# Cydonia-ForgottenThoughts-24B
This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with TheDrummer/Cydonia-24B-v2 as the base model.
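TIES merging builds a task vector (finetuned weights minus base weights) for each donor model, trims low-magnitude entries, elects a per-element sign, and averages only the entries that agree with that sign before adding the result back to the base. The snippet below is an illustrative sketch of that idea for a single parameter tensor, with simplified scalar `density`, `weights`, and `lam` values; it is not mergekit's actual implementation, which applies these parameters per-layer via the filters shown in the configuration further down.

```python
# Illustrative sketch of the TIES idea for one tensor (not mergekit's code).
import torch

def ties_merge_tensor(base, finetuned, weights, density=0.5, lam=1.0):
    # Task vectors: what each finetuned model changed relative to the base,
    # scaled by that model's merge weight.
    deltas = [w * (ft - base) for ft, w in zip(finetuned, weights)]

    # Trim: keep only the top `density` fraction of entries by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.numel()))
        threshold = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= threshold, d, torch.zeros_like(d)))

    # Elect sign: per-element majority sign of the summed trimmed task vectors.
    stacked = torch.stack(trimmed)
    sign = torch.sign(stacked.sum(dim=0))

    # Disjoint merge: average only the entries that agree with the elected sign.
    agree = (torch.sign(stacked) == sign) & (stacked != 0)
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)

    # Scale the merged task vector and add it back onto the base weights.
    return base + lam * merged
```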
### Models Merged
The following models were included in the merge:
- ReadyArt/Forgotten-Safeword-24B-v3.0
- Undi95/MistralThinker-v1.1
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  # Base model
  - model: TheDrummer/Cydonia-24B-v2
  # Merge models
  - model: ReadyArt/Forgotten-Safeword-24B-v3.0
    parameters:
      density:
        - filter: self_attn
          value: 0.75
        - value: 0.5
      weight:
        - filter: self_attn
          value: [0.5, 0.35, 0.5, 0.35, 0.5]
        - filter: mlp
          value: [0.25, 0.35, 0.25, 0.35, 0.25]
        - value: 0.35
      lambda:
        - filter: self_attn
          value: 1.05
        - value: 0.9
  - model: Undi95/MistralThinker-v1.1
    parameters:
      density:
        - filter: mlp
          value: 0.75
        - value: 0.5
      weight:
        - filter: mlp
          value: [0.5, 0.35, 0.5, 0.35, 0.5]
        - filter: self_attn
          value: [0.25, 0.35, 0.25, 0.35, 0.25]
        - value: 0.35
      lambda:
        - filter: mlp
          value: 1.05
        - value: 0.9
merge_method: ties
base_model: TheDrummer/Cydonia-24B-v2
parameters:
  normalize: true
dtype: float32
```
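To reproduce the merge, save the YAML above to a file and run it through mergekit's `mergekit-yaml` command. To run the resulting model, something like the following should work; it is a minimal sketch assuming the merged weights are published under `Casual-Autopsy/Cydonia-ForgottenThoughts-24B` and that your hardware can hold a 24B-parameter model (loading in bfloat16 here is an assumption for inference, not the float32 dtype used during the merge).

```python
# Minimal sketch: load and sample from the merged model with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Casual-Autopsy/Cydonia-ForgottenThoughts-24B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 to reduce memory at inference
    device_map="auto",
)

prompt = "Write a short scene set aboard a forgotten space station."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```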