# merge

This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method

This model was merged using the Passthrough merge method.
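Passthrough performs no tensor interpolation; it simply stacks the listed layer slices into a deeper model. The configuration below uses it as a depth-upscaling recipe: each block of four original layers is followed by a repeated two-layer slice whose `o_proj` (attention output) and `down_proj` (MLP output) projections are scaled to zero, so every duplicated layer initially contributes nothing beyond its residual stream. A small sketch verifying the resulting layer count follows the configuration below.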
### Models Merged

The following models were included in the merge:
* [Tarek07/Legion-V2.1-LLaMa-70B](https://huggingface.co/Tarek07/Legion-V2.1-LLaMa-70B)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - layer_range: [0, 4]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [2, 4]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [4, 8]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [6, 8]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [8, 12]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [10, 12]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [12, 16]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [14, 16]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [16, 20]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [18, 20]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [20, 24]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [22, 24]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [24, 28]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [26, 28]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [28, 32]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [30, 32]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [32, 36]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [34, 36]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [36, 40]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [38, 40]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [40, 44]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [42, 44]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [44, 48]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [46, 48]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [48, 52]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [50, 52]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [52, 56]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [54, 56]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [56, 60]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [58, 60]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [60, 64]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [62, 64]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [64, 68]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [66, 68]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [68, 72]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [70, 72]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [72, 76]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [74, 76]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
- sources:
  - layer_range: [76, 80]
    model: Tarek07/Legion-V2.1-LLaMa-70B
- sources:
  - layer_range: [78, 80]
    model: Tarek07/Legion-V2.1-LLaMa-70B
    parameters:
      scale:
      - filter: o_proj
        value: 0
      - filter: down_proj
        value: 0
      - value: 1
merge_method: passthrough
dtype: bfloat16
```
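The repeating pattern above can be sanity-checked programmatically. The following Python sketch is hypothetical and independent of mergekit; its only assumption is the 80-layer depth of a Llama 70B-class model. It regenerates the layer ranges and confirms the depth of the merged model:

```python
# Regenerate the slice pattern from the YAML above and count the
# resulting decoder layers of the merged model.

BASE_LAYERS = 80  # assumption: Llama 70B-class models use 80 decoder layers

slices = []
for start in range(0, BASE_LAYERS, 4):
    slices.append((start, start + 4))      # four original layers, unmodified
    slices.append((start + 2, start + 4))  # two repeated layers, zero-scaled

total_layers = sum(end - start for start, end in slices)
print(slices[:4])    # [(0, 4), (2, 4), (4, 8), (6, 8)] -- matches the YAML
print(total_layers)  # 120 = 80 original + 40 duplicated layers
```

Growing an 80-layer, 70B-parameter model to 120 layers scales the parameter count by roughly 1.5x, which accounts for the ~105B size of the result.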