Update README.md
---

# Notes

For a model_stock merge, this has greatly exceeded my expectations. It beats Lamarck v0.7's average without introducing DeepSeek elements, mostly by scoring high on MATH without giving up much elsewhere. It also shows that the high-scoring Qwen2.5 14B merges are converging near the limits of the architecture. Here is how it benchmarks alongside the models it merges.
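For readers unfamiliar with the technique: a model_stock merge is typically defined in a mergekit YAML config along these lines. This is a hypothetical sketch, not this model's actual recipe; the model names below are placeholders.

```yaml
# Hypothetical mergekit config sketch for a model_stock merge.
# The base_model and models list are placeholders, not the recipe used here.
merge_method: model_stock
base_model: Qwen/Qwen2.5-14B
models:
  - model: exampleuser/qwen2.5-14b-finetune-a
  - model: exampleuser/qwen2.5-14b-finetune-b
dtype: bfloat16
```

model_stock averages the fine-tuned checkpoints toward the base model using geometrically derived weights, which is why it needs a `base_model` and at least two `models` entries.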