Update README.md
README.md CHANGED
@@ -154,7 +154,8 @@ gemma-2b-orpo performs well for its size on Nous' benchmark suite.
 | [google/gemma-2b](https://huggingface.co/google/gemma-2b) [π](https://gist.github.com/mlabonne/7df1f238c515a5f63a750c8792cef59e) | 34.26 | 22.7 | 43.35 | 39.96 | 31.03 |
 
 ### [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
-Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_anakin87__gemma-2b-orpo)
+Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_anakin87__gemma-2b-orpo).
+By comparison, on the Open LLM Leaderboard, google/gemma-2b-it has an average of 42.75.
 
 | Metric |Value|
 |---------------------------------|----:|
@@ -166,7 +167,6 @@ Detailed results can be found [here](https://huggingface.co/datasets/open-llm-le
 |Winogrande (5-shot) |64.33|
 |GSM8k (5-shot) |13.87|
 
-By comparison, on the Open LLM Leaderboard, google/gemma-2b-it has an average of 42.75.
 
 ## π Dataset
 [`alvarobartt/dpo-mix-7k-simplified`](https://huggingface.co/datasets/alvarobartt/dpo-mix-7k-simplified)
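
For quick inspection, the preference dataset referenced in the Dataset section can be pulled with the `datasets` library. The snippet below is a minimal sketch, not part of the README change: it assumes the dataset exposes a standard `train` split and prints the row count and column names rather than relying on specific field names.

```python
# Minimal sketch: load the preference dataset linked in the Dataset section.
# Assumes the `datasets` library is installed and a standard "train" split exists;
# inspect the printed columns rather than assuming exact field names
# (DPO-style mixes typically use prompt/chosen/rejected).
from datasets import load_dataset

ds = load_dataset("alvarobartt/dpo-mix-7k-simplified", split="train")

print(ds)      # row count and column names
print(ds[0])   # one preference example
```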