merged_model
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the passthrough merge method, which stacks the selected layer slices end to end without interpolating any weights; a short sketch after the configuration below illustrates the resulting depth.
Models Merged
The following models were included in the merge:
- meta-llama/Meta-Llama-3-70B-Instruct (the sole source; all nine slices are drawn from this one model, making this a self-merge)
Configuration
The following YAML configuration was used to produce this model:
slices:
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [0, 16]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [8, 24]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [16, 32]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [24, 40]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [32, 48]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [40, 56]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [48, 64]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [56, 72]
- sources:
  - model: meta-llama/Meta-Llama-3-70B-Instruct
    layer_range: [64, 80]
merge_method: passthrough
dtype: float16
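Each of the nine slices spans 16 layers, and consecutive slices overlap by 8, so the merge duplicates blocks of the source model and yields a network considerably deeper than the 80-layer original. A minimal Python sketch of the arithmetic, illustrative only and not mergekit's internals:

# Illustrative sketch, not mergekit internals: passthrough stacks the layer
# slices back to back, so overlapping ranges duplicate layers.
slices = [(0, 16), (8, 24), (16, 32), (24, 40), (32, 48),
          (40, 56), (48, 64), (56, 72), (64, 80)]

base = list(range(80))  # stand-in for the 80 decoder layers of the source model

merged = []
for start, end in slices:
    merged.extend(base[start:end])  # overlapping indices are copied again

print(len(merged))  # 144 layers: nine 16-layer slices, versus 80 in the source

To reproduce the merge, the configuration above can be saved to a file and passed to mergekit's mergekit-yaml command-line tool.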
Model tree for denru/Meta-Llama-3-70B-Instruct-x2
- Base model: meta-llama/Meta-Llama-3-70B
- Finetuned: meta-llama/Meta-Llama-3-70B-Instruct
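A minimal sketch of loading and prompting the merged model with the transformers library, assuming the repository name from the model tree above; device_map="auto" additionally assumes accelerate is installed:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "denru/Meta-Llama-3-70B-Instruct-x2"  # repo name from the model tree above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype in the merge configuration
    device_map="auto",          # requires accelerate; shards across available devices
)

# Llama 3 Instruct checkpoints ship a chat template, which the tokenizer applies.
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Briefly introduce yourself."}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

Note that with 144 decoder layers the merged checkpoint is roughly 1.8x the size of the 70B source, so memory requirements are correspondingly higher.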