---
base_model:
- Qwen/Qwen3-30B-A3B-Base
- allura-forge/q3-30b-ft-ep2-merged
- Qwen/Qwen3-30B-A3B
- Gryphe/Pantheon-Proto-RP-1.8-30B-A3B
library_name: transformers
tags:
- mergekit
- merge
---
Please see Pentiment for the final result of this merge
# output
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the SCE merge method, with Qwen/Qwen3-30B-A3B-Base as the base.
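Roughly, SCE (Select-Calculate-Erase) builds per-tensor task vectors (each fine-tune minus the base), keeps only the elements where the donors disagree most (the `select_topk` fractions in the configuration below), weights each donor by the squared magnitude of what survives, and drops elements whose sign conflicts with the majority before adding the result back onto the base. Below is a minimal, hedged PyTorch sketch of that idea for a single weight tensor; the function name, masking strategy, and normalization are illustrative assumptions, not mergekit's actual implementation.

```python
# Hedged sketch of the SCE idea for one weight tensor; details may differ
# from mergekit's `sce` merge method.
import torch


def sce_merge_tensor(base: torch.Tensor,
                     tuned: list[torch.Tensor],
                     select_topk: list[float]) -> torch.Tensor:
    # Task vectors: how each fine-tuned model differs from the shared base.
    deltas = torch.stack([t - base for t in tuned])  # (n_models, ...)

    # Select: element-wise variance across models marks where they disagree
    # most; each model keeps only its `select_topk` fraction of those elements.
    var = deltas.var(dim=0)
    for i, frac in enumerate(select_topk):
        if frac < 1.0:
            k = max(1, int(frac * var.numel()))
            thresh = var.flatten().topk(k).values.min()
            deltas[i] = deltas[i] * (var >= thresh)

    # Calculate: weight each model by the squared magnitude of what survived.
    reduce_dims = tuple(range(1, deltas.dim()))
    weights = (deltas ** 2).sum(dim=reduce_dims)
    weights = weights / weights.sum().clamp_min(1e-12)

    # Erase: zero out elements whose sign conflicts with the elected majority.
    majority_sign = deltas.sum(dim=0).sign()
    deltas = deltas * (deltas.sign() == majority_sign)

    # Merge: weighted sum of the surviving task vectors, added back to base.
    view = (-1,) + (1,) * (deltas.dim() - 1)
    return base + (weights.view(view) * deltas).sum(dim=0)
```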
### Models Merged

The following models were included in the merge:
* allura-forge/q3-30b-ft-ep2-merged
* Gryphe/Pantheon-Proto-RP-1.8-30B-A3B
* Qwen/Qwen3-30B-A3B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: Qwen/Qwen3-30B-A3B-Base
models:
  - model: allura-forge/q3-30b-ft-ep2-merged
    parameters:
      select_topk: 0.75
  - model: Gryphe/Pantheon-Proto-RP-1.8-30B-A3B
    parameters:
      select_topk: 0.4
  - model: Qwen/Qwen3-30B-A3B
    parameters:
      select_topk: 0.25
merge_method: sce
dtype: bfloat16
```
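To reproduce the merge, a configuration like the one above can be passed to mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged-output` (assuming mergekit is installed; exact flags such as `--cuda` depend on your environment and mergekit version).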