notevo_merge_biomistral_e57b_slerp

This is a merge of pre-trained language models created using mergekit (https://github.com/arcee-ai/mergekit).

Merge Details

Merge Method

This model was merged using the SLERP merge method.
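SLERP (spherical linear interpolation) blends each pair of corresponding weight tensors along the arc between them on the unit sphere rather than along a straight line, which preserves the scale and geometry of the weights better than plain linear averaging. For reference, a minimal NumPy sketch of the formula (not mergekit's actual implementation, which adds further safeguards and per-layer parameter handling):

import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors."""
    v0_u = v0 / (np.linalg.norm(v0) + eps)
    v1_u = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_u, v1_u), -1.0, 1.0)
    theta = np.arccos(dot)            # angle between the two tensors
    if theta < eps:                   # nearly parallel: fall back to lerp
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

When the two tensors are nearly parallel, the arc degenerates and SLERP reduces to ordinary linear interpolation, which the sketch handles with the small-angle fallback.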

Models Merged

The following models were included in the merge:

  • ./e5-mistral-7b-instruct_with_lm_head
  • ./local_BioMistral-7B

Configuration

The following YAML configuration was used to produce this model:

slices:
  - sources:
      - model: ./e5-mistral-7b-instruct_with_lm_head
        layer_range: [0, 32]
      - model: ./local_BioMistral-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: ./e5-mistral-7b-instruct_with_lm_head
parameters:
  t:
    - value: [0.5, 0.1]
dtype: float16
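In mergekit, t is the interpolation weight toward the non-base model: t = 0 keeps the base model (here ./e5-mistral-7b-instruct_with_lm_head) and t = 1 keeps ./local_BioMistral-7B. A list such as [0.5, 0.1] is spread across the layer range, so early layers are close to an even blend while deeper layers stay close to the base. A rough sketch of that per-layer schedule, assuming a simple linear ramp over the 32 layers:

# Hypothetical illustration of how a t gradient like [0.5, 0.1]
# maps to individual layers in the [0, 32) slice.
anchors = [0.5, 0.1]
num_layers = 32

def t_for_layer(i: int) -> float:
    """Linearly interpolate the anchor values across the layer index."""
    frac = i / (num_layers - 1)       # position of layer i in [0, 1]
    return anchors[0] + frac * (anchors[1] - anchors[0])

for i in (0, 15, 31):
    print(f"layer {i:2d}: t = {t_for_layer(i):.3f}")
# layer  0: t = 0.500
# layer 15: t = 0.306
# layer 31: t = 0.100

With the configuration above saved as, say, config.yml, the merge should be reproducible with mergekit's CLI, e.g. mergekit-yaml config.yml ./output-dir (the output path here is illustrative).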