I am experimenting with some of the DELLA merge method parameters.

# Progenitor-V2.3-70B

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DELLA merge method, with nbeerbower/Llama-3.1-Nemotron-lorablated-70B as the base.
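
For context: as I understand the method, DELLA extends DARE-style drop-and-rescale by making the drop probability of each delta parameter depend on its magnitude, so low-magnitude (likely noisy) entries are dropped more often, and the survivors are rescaled to keep the delta unbiased in expectation. The sketch below is a minimal PyTorch illustration of that idea, not mergekit's actual code; the function names and the linear probability schedule are assumptions on my part. Roughly, `density` is the mean fraction of each delta kept, `epsilon` the spread of keep probabilities across the magnitude ranking, and `lambda` a final scale on the summed deltas.

```python
import torch

def della_prune(delta: torch.Tensor, density: float = 0.7, epsilon: float = 0.15) -> torch.Tensor:
    """Magnitude-aware stochastic pruning of a task vector (delta).

    Larger-magnitude entries are kept with higher probability; keep
    probabilities are centered on `density` with a spread of `epsilon`.
    Survivors are rescaled by 1/p so the pruned delta matches the
    original in expectation. Illustrative sketch only.
    """
    flat = delta.flatten().float()
    # Rank entries by |value|: 0.0 = smallest magnitude, 1.0 = largest.
    ranks = flat.abs().argsort().argsort().float() / max(flat.numel() - 1, 1)
    # Linear keep-probability schedule in [density - eps/2, density + eps/2].
    keep_prob = (density - epsilon / 2 + epsilon * ranks).clamp(1e-6, 1.0)
    mask = torch.bernoulli(keep_prob)
    return (flat * mask / keep_prob).view_as(delta)

def della_merge(base, tuned_models, weights, density=0.7, epsilon=0.15, lam=1.1):
    """Per-tensor merge: base + lambda * sum_i(w_i * pruned(delta_i))."""
    merged = base.clone().float()
    for tuned, w in zip(tuned_models, weights):
        merged += lam * w * della_prune(tuned.float() - base.float(), density, epsilon)
    return merged

# Example: merge five equally weighted tensors, mirroring the config below.
# merged = della_merge(base_tensor, tuned_tensors, weights=[0.20] * 5,
#                      density=0.7, epsilon=0.15, lam=1.1)
```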

### Models Merged

The following models were included in the merge:

* EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
* Sao10K/L3.1-70B-Hanami-x1
* Sao10K/70B-L3.3-Cirrus-x1
* TheDrummer/Anubis-70B-v1
* SicariusSicariiStuff/Negative_LLAMA_70B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Sao10K/L3.1-70B-Hanami-x1
    parameters:
      weight: 0.20
  - model: Sao10K/70B-L3.3-Cirrus-x1
    parameters:
      weight: 0.20
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
    parameters:
      weight: 0.20
  - model: TheDrummer/Anubis-70B-v1
    parameters:
      weight: 0.20
  - model: EVA-UNIT-01/EVA-LLaMA-3.33-70B-v0.1
    parameters:
      weight: 0.20
merge_method: della
base_model: nbeerbower/Llama-3.1-Nemotron-lorablated-70B
parameters:
  density: 0.7
  epsilon: 0.15
  lambda: 1.1
  rescale: 1
  window_size: 0.14
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: union
```
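
For reproducibility: assuming a standard mergekit installation, a config like this is typically run with `mergekit-yaml config.yaml ./output-model-dir --cuda` (paths are placeholders; `--cuda` is optional). Per the `dtype`/`out_dtype` settings, the merge arithmetic is done in float32 and the result is saved in bfloat16.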