# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method, with lunahr/SystemGemma2-27b-it as the base model.
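For reference, SLERP interpolates each pair of corresponding weight tensors along the arc between them on the hypersphere rather than along a straight line; the per-filter `t` values in the configuration below set the interpolation factor (0 = first model, 1 = second model) for the self-attention and MLP weights across layer groups. The snippet below is a minimal sketch of the interpolation itself, not mergekit's implementation; the `slerp` helper, tensor shapes, and variable names are illustrative only.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc between
    the two points instead of the straight (linear-interpolation) path.
    """
    v0_u = v0 / (np.linalg.norm(v0) + eps)
    v1_u = v1 / (np.linalg.norm(v1) + eps)
    dot = float(np.clip(np.dot(v0_u, v1_u), -1.0, 1.0))
    omega = np.arccos(dot)              # angle between the two weight vectors
    if omega < eps:                     # nearly parallel: fall back to plain lerp
        return (1.0 - t) * v0 + t * v1
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + (np.sin(t * omega) / sin_omega) * v1

# Example: t = 0.3 leans toward the first model's weights, t = 0.7 toward the second's.
a = np.random.randn(4096).astype(np.float32)
b = np.random.randn(4096).astype(np.float32)
blended = slerp(0.3, a, b)
```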

### Models Merged

The following models were included in the merge:

* lunahr/SystemGemma2-27b-it (base)
* Saxo/Linkbricks-Horizon-AI-Korean-Superb-27B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: lunahr/SystemGemma2-27b-it
        layer_range: [0, 16]
      - model: Saxo/Linkbricks-Horizon-AI-Korean-Superb-27B
        layer_range: [0, 16]
merge_method: slerp
base_model: lunahr/SystemGemma2-27b-it
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
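A merge like this can be reproduced by saving the YAML above to a file and passing it to mergekit, either via the `mergekit-yaml` command-line tool or via its Python entry points. The sketch below follows the usage shown in mergekit's documentation; exact option names can differ between mergekit versions, and `slerp-config.yml` and `./merged` are placeholder paths.

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (path is an example).
with open("slerp-config.yml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

# Run the merge and write the blended model to ./merged (output path is an example).
run_merge(
    merge_config,
    "./merged",
    options=MergeOptions(
        cuda=True,            # use a GPU for the tensor arithmetic if available
        copy_tokenizer=True,  # copy the base model's tokenizer into the output
    ),
)
```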