Update README.md
README.md CHANGED
@@ -23,12 +23,15 @@ $$
 
 ### Why Though?
 unfortunately this is not as simple as [typeof/zephyr-7b-beta-lora](https://huggingface.co/typeof/zephyr-7b-beta-lora)
-due to the way in which [OpenHermes](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) was trained...
+due to the way in which [OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) was trained...
 by adding tokens, the correspondence is not 1-to-1 with [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
 as is the case with [typeof/zephyr-7b-beta-lora](https://huggingface.co/typeof/zephyr-7b-beta-lora) ...
 nevertheless, if you have found yourself here, I'm sure you can figure out how to use it... if not, open up an issue!
 
 
+
+photo courtesy of @teknium's [OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B)
+
 <!--
 $$ W_{mistral} + LoRA_{zephyr} = W_{zephyr} $$
 ```
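
For reference, a minimal sketch of how this kind of adapter could be applied, assuming it is a standard PEFT LoRA checkpoint: the `adapter_id` below is a placeholder for this repo, and details such as whether the adapter also ships the resized embedding / lm_head weights depend on how it was exported.

```python
# Sketch only (see assumptions above): load the OpenHermes tokenizer so the added
# tokens exist, resize the plain Mistral base model's embeddings to match that
# vocabulary, then attach the LoRA adapter with PEFT.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"
chat_id = "teknium/OpenHermes-2.5-Mistral-7B"
adapter_id = "<this-repo>"  # placeholder: the id of this LoRA adapter

tokenizer = AutoTokenizer.from_pretrained(chat_id)      # tokenizer with the added tokens
model = AutoModelForCausalLM.from_pretrained(base_id)   # base Mistral weights
model.resize_token_embeddings(len(tokenizer))           # make room for the extra tokens
model = PeftModel.from_pretrained(model, adapter_id)    # apply the LoRA on top
```

The resize step is what the added tokens make necessary: without it, the base model's embedding shapes would not line up with an adapter trained against the OpenHermes vocabulary.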
|