yamatazen committed
Commit 0f43862 · verified · 1 parent: e5b2f1a

Update README.md

Files changed (1)
  1. README.md +30 -27
README.md CHANGED
@@ -6,31 +6,34 @@ library_name: transformers
  tags:
  - mergekit
  - merge
-
+ language:
+ - en
+ - ja
  ---
- # HMS-Fusion-12B
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the [Arcee Fusion](https://arcee.ai) merge method using [shisa-ai/shisa-v2-mistral-nemo-12b](https://huggingface.co/shisa-ai/shisa-v2-mistral-nemo-12b) as a base.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [yamatazen/Himeyuri-Magnum-12B](https://huggingface.co/yamatazen/Himeyuri-Magnum-12B)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- merge_method: arcee_fusion
- dtype: bfloat16
- out_dtype: bfloat16
- base_model: shisa-ai/shisa-v2-mistral-nemo-12b
- models:
- - model: yamatazen/Himeyuri-Magnum-12B
- ```
+ ![image/png](https://huggingface.co/yamatazen/HMS-Fusion-12B/resolve/main/HMS-Fusion-12B.png?download=true)
+ # HMS-Fusion-12B
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the [Arcee Fusion](https://arcee.ai) merge method using [shisa-ai/shisa-v2-mistral-nemo-12b](https://huggingface.co/shisa-ai/shisa-v2-mistral-nemo-12b) as a base.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [yamatazen/Himeyuri-Magnum-12B](https://huggingface.co/yamatazen/Himeyuri-Magnum-12B)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ merge_method: arcee_fusion
+ dtype: bfloat16
+ out_dtype: bfloat16
+ base_model: shisa-ai/shisa-v2-mistral-nemo-12b
+ models:
+ - model: yamatazen/Himeyuri-Magnum-12B
+ ```
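
The updated card keeps the mergekit configuration used to produce the merge. As a minimal sketch of how such a config is typically run (not part of this commit): save the YAML above to a file and pass it to mergekit's `mergekit-yaml` command. The file name, output directory, and `--cuda` flag below are illustrative assumptions, and the `arcee_fusion` method requires a mergekit version that includes Arcee Fusion support.

```bash
# Install mergekit (see the mergekit README for the recommended install method).
pip install mergekit

# Save the configuration above as hms-fusion.yaml, then run the merge.
# The output directory name is arbitrary; --cuda uses a GPU if one is available.
mergekit-yaml hms-fusion.yaml ./HMS-Fusion-12B --cuda
```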