# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP (spherical linear interpolation) merge method, with CultriX/SeQwence-14Bv1 as the base model.
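For reference, SLERP interpolates between two weight tensors along an arc on the hypersphere rather than along a straight line, which preserves the norm-direction geometry of the weights better than plain averaging. Below is a minimal NumPy sketch of the idea; it is not mergekit's actual implementation, which additionally handles per-layer `t` schedules and other edge cases:

```python
import numpy as np

def slerp(t: float, w1: np.ndarray, w2: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns w1, t=1 returns w2; intermediate t follows the arc
    between the two directions rather than the straight chord.
    """
    v1 = w1 / (np.linalg.norm(w1) + eps)
    v2 = w2 / (np.linalg.norm(w2) + eps)
    dot = np.clip(np.sum(v1 * v2), -1.0, 1.0)
    theta = np.arccos(dot)          # angle between the two weight directions
    if theta < eps:                 # nearly parallel: fall back to linear interpolation
        return (1 - t) * w1 + t * w2
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * w1 + (np.sin(t * theta) / s) * w2
```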

### Models Merged

The following models were included in the merge:

* CultriX/SeQwence-14Bv1 (base model)
* allknowingroger/Qwenslerp2-14B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: CultriX/SeQwence-14Bv1
  - model: allknowingroger/Qwenslerp2-14B
merge_method: slerp
base_model: CultriX/SeQwence-14Bv1
dtype: bfloat16
parameters:
  t: [0, 0.5, 1, 0.5, 0] # V-shaped curve: the base model (SeQwence-14Bv1) dominates the input & output layers, Qwenslerp2-14B the middle layers
```
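Assuming mergekit is installed (`pip install mergekit`), a merge like this can be reproduced from the YAML above through mergekit's Python API. The file path, output directory, and options below are illustrative, not the exact settings used for this model:

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (file name is hypothetical).
with open("slerp-config.yaml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    out_path="./QwenSlerp6-14B",         # output directory (hypothetical)
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if available
        copy_tokenizer=True,             # copy the base model's tokenizer to the output
        lazy_unpickle=True,              # reduce peak memory while loading checkpoints
    ),
)
```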

## Open LLM Leaderboard Evaluation Results

Detailed results can be found on the Open LLM Leaderboard.

| Metric               | Value |
|----------------------|------:|
| Avg.                 | 39.02 |
| IFEval (0-shot)      | 68.67 |
| BBH (3-shot)         | 47.59 |
| MATH Lvl 5 (4-shot)  | 34.14 |
| GPQA (0-shot)        | 16.44 |
| MuSR (0-shot)        | 18.32 |
| MMLU-PRO (5-shot)    | 48.95 |
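The merged model loads like any other Hugging Face checkpoint. A standard transformers sketch is shown below; `torch_dtype=torch.bfloat16` matches the merge dtype, while the prompt and generation settings are illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allknowingroger/QwenSlerp6-14B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain SLERP model merging in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```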