# L3-8B-Niitamamo-v0.1-della

L3-8B-Niitamamo-v0.1-della is a merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):

* [Sao10K/L3-8B-Niitama-v1](https://huggingface.co/Sao10K/L3-8B-Niitama-v1)
* [Sao10K/L3-8B-Tamamo-v1](https://huggingface.co/Sao10K/L3-8B-Tamamo-v1)

## 🧩 Configuration

```yaml
base_model: Sao10K/L3-8B-Niitama-v1
merge_method: della
models:
  - model: Sao10K/L3-8B-Niitama-v1
    parameters:
      weight: 1.0
  - model: Sao10K/L3-8B-Tamamo-v1
    parameters:
      weight: 1.0
dtype: bfloat16
```
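The merge itself can be reproduced from the configuration above with mergekit's `mergekit-yaml` command. Below is a minimal inference sketch with 🤗 Transformers, assuming the merged weights are published as onion33/L3-8B-Niitamamo-v0.1-della and that the tokenizer ships the Llama 3 chat template; the prompt and generation settings are illustrative only.

```python
# Minimal inference sketch (assumes the merged model is available as
# onion33/L3-8B-Niitamamo-v0.1-della; adjust the repo id to your own copy).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "onion33/L3-8B-Niitamamo-v0.1-della"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

# Llama 3 instruct-style chat prompt; content and sampling settings are illustrative.
messages = [{"role": "user", "content": "Write a short scene set on a rainy night."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since the configuration merges both source models at equal weight in bfloat16, the result loads and generates like a standard Llama-3-8B checkpoint.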