|
--- |
|
base_model: ["Sao10K/Fimbulvetr-11B-v2"] |
|
library_name: transformers |
|
tags: |
|
- mergekit |
|
- merge |
|
|
|
--- |
|
# UnFimbulvetr-20B-V2 |
|
|
|
|
|
|
*Waifu to catch your attention* |
|
|
|
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). |
|
|
|
## GGUF Tests |
|
|
|
Seems usable, but it's hard to tell whether the stacked layers actually helped. My impression is that something was added, though I can't say for certain.
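
If you want to run your own checks on the unquantized weights, the merge loads like any other Llama-architecture model in 🤗 Transformers. A minimal sketch; the repo id below is a placeholder, not a confirmed path:

```python
# Minimal smoke test for the merged model. The repo id is hypothetical --
# substitute the actual Hugging Face repo this card belongs to.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Your-Namespace/UnFimbulvetr-20B-V2"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

inputs = tokenizer("Once upon a time", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```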
|
|
|
## Merge Details |
|
### Merge Method |
|
|
|
This model was merged using the passthrough merge method, which stacks slices of the source model's layers end-to-end rather than averaging weights.
|
|
|
### Models Merged |
|
|
|
The following models were included in the merge: |
|
* [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2) |
|
|
|
### Configuration |
|
|
|
The following YAML configuration was used to produce this model: |
|
|
|
```yaml
slices:
  - sources:
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [0, 13]
  - sources:
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [8, 13]
  - sources:
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [12, 36]
  - sources:
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [12, 36]
  - sources:
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [20, 36]
  - sources:
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [36, 48]
merge_method: passthrough
dtype: bfloat16
```
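
For reference, the slices above stack 13 + 5 + 24 + 24 + 16 + 12 = 94 decoder layers, up from 48 in the base model, which is where the ~20B in the name comes from. A tiny sketch of the arithmetic (purely illustrative, not mergekit code):

```python
# Layer count implied by the passthrough config above.
# Each slice contributes end - start layers; overlapping ranges are
# duplicated verbatim in the merged model.
slices = [(0, 13), (8, 13), (12, 36), (12, 36), (20, 36), (36, 48)]

total = sum(end - start for start, end in slices)
print(total)  # 94 layers in the merge, versus 48 in Fimbulvetr-11B-v2
```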
|
|