# MagTie-v1-12B

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

We used a pretrained base model as the base for a DARE-TIES merge, boosting the weights and densities of the contributing models to compensate for DARE's sparsification and retain more of their training.
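For intuition, the snippet below is a simplified sketch of DARE's drop-and-rescale step on a single task vector, not mergekit's actual implementation (the helper name `dare_sparsify` and the toy tensors are illustrative only). Each contributor's delta from the base is kept with probability `density` and rescaled, which is why the higher densities and weights in the configuration below preserve more of each contributing model's training.

```python
import torch

def dare_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Randomly drop (1 - density) of a task vector and rescale the survivors.

    Simplified illustration of DARE; the full DARE-TIES merge also performs
    TIES sign election across models before summing the deltas.
    """
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density  # rescale so the expected value is unchanged

# Toy example: merge two fine-tuned "models" (single tensors) into a base.
base = torch.zeros(4, 4)
finetunes = [torch.randn(4, 4), torch.randn(4, 4)]
weight, density = 0.85, 0.75  # values from the config below

merged = base + sum(weight * dare_sparsify(ft - base, density)
                    for ft in finetunes)
```

In the real merge, `normalize: true` additionally rescales the per-model weights across contributors before the weighted sum is applied.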

## Merge Details

### Merge Method

This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with [grimjim/mistralai-Mistral-Nemo-Base-2407](https://huggingface.co/grimjim/mistralai-Mistral-Nemo-Base-2407) as the base.

### Models Merged

The following models were included in the merge:

* [inflatebot/MN-12B-Mag-Mell-R1](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1)
* [Delta-Vector/Francois-Huali-12B](https://huggingface.co/Delta-Vector/Francois-Huali-12B)
* [grimjim/Magnolia-v3-12B](https://huggingface.co/grimjim/Magnolia-v3-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: grimjim/mistralai-Mistral-Nemo-Base-2407
models:
  - model: grimjim/mistralai-Mistral-Nemo-Base-2407
  - model: inflatebot/MN-12B-Mag-Mell-R1
    parameters:
      weight: 0.85
      density: 0.75
  - model: Delta-Vector/Francois-Huali-12B
    parameters:
      weight: 0.85
      density: 0.75
  - model: grimjim/Magnolia-v3-12B
    parameters:
      weight: 0.85
      density: 0.75
merge_method: dare_ties
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
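
This configuration can be rendered into a merged checkpoint with mergekit's `mergekit-yaml` command-line tool. The result loads like any other Mistral-Nemo-derived model; below is a minimal usage sketch, assuming `transformers` and `torch` are installed (the prompt string is an arbitrary example):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "grimjim/MagTie-v1-12B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",           # requires accelerate; omit to load on CPU
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```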