Update README.md
README.md CHANGED
@@ -6,8 +6,21 @@ library_name: transformers
tags:
- mergekit
- merge
+license: mit
+language:
+- en
---
+
+Hopefully this merge took correctly!
+
+Enabling thoughts to be displayed.
+
+This is obviously untrained and will still need fine-tuning.
+It has also not been correctly coded for true management via the transformers pretrained args.
+I will try to add the other arch, leaving it available to perhaps load with a different remote auto mapping.
+I will leave both auto mappings here and test both models to see which configuration loads correctly for training, then which loads correctly for usage, as this has also been a minor issue.
+The internal heads have default settings; with remote code installed, they should be configurable.
+
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
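As a minimal sketch of what the remote auto-mapping notes above imply: a checkpoint whose architecture lives in the repo's custom modeling code is loaded with `trust_remote_code=True`. The repo id below is a placeholder, not this model's actual path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

def load_merged(repo_id: str):
    """Load a checkpoint whose architecture is registered via the repo's
    auto_map, so the custom modeling code is fetched and used at load time."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
    return tokenizer, model

# Usage with a placeholder repo id (substitute the real model path):
# tokenizer, model = load_merged("your-name/your-merged-model")
```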
@@ -55,4 +68,4 @@ parameters:
- value: 0.5 # fallback for rest of tensors
dtype: float16

```
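The `- value: 0.5 # fallback for rest of tensors` line shown in the hunk above is the tail of a mergekit configuration. For context, a complete slerp config of that shape looks roughly like the following sketch; the model names are placeholders, not the actual merge inputs.

```yaml
# Hypothetical source models -- substitute the actual merge inputs.
slices:
  - sources:
      - model: model-a
        layer_range: [0, 32]
      - model: model-b
        layer_range: [0, 32]
merge_method: slerp
base_model: model-a
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: float16
```

A config like this is applied with `mergekit-yaml config.yml ./output-model-dir`.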