fix(readme): Fixes 128k using results table from 4k.
README.md
@@ -186,25 +186,25 @@ The number of k–shot examples is listed per-benchmark.
 
 | | Phi-3-Mini-128K-In<br>3.8b | Phi-3-Small<br>7b (preview) | Phi-3-Medium<br>14b (preview) | Phi-2<br>2.7b | Mistral<br>7b | Gemma<br>7b | Llama-3-In<br>8b | Mixtral<br>8x7b | GPT-3.5<br>version 1106 |
 |---|---|---|---|---|---|---|---|---|---|
-| MMLU <br>5-Shot | 68.
-| HellaSwag <br> 5-Shot |
+| MMLU <br>5-Shot | 68.1 | 75.3 | 78.2 | 56.3 | 61.7 | 63.6 | 66.0 | 68.4 | 71.4 |
+| HellaSwag <br> 5-Shot | 74.5 | 78.7 | 83.2 | 53.6 | 58.5 | 49.8 | 69.5 | 70.4 | 78.8 |
 | ANLI <br> 7-Shot | 52.8 | 55.0 | 58.7 | 42.5 | 47.1 | 48.7 | 54.8 | 55.2 | 58.1 |
-| GSM-8K <br> 0-Shot; CoT |
-| MedQA <br> 2-Shot |
-| AGIEval <br> 0-Shot |
-| TriviaQA <br> 5-Shot |
-| Arc-C <br> 10-Shot | 84.
-| Arc-E <br> 10-Shot |
-| PIQA <br> 5-Shot |
-| SociQA <br> 5-Shot | 76.
-| BigBench-Hard <br> 0-Shot | 71.
-| WinoGrande <br> 5-Shot |
-| OpenBookQA <br> 10-Shot |
-| BoolQ <br> 0-Shot |
-| CommonSenseQA <br> 10-Shot |
-| TruthfulQA <br> 10-Shot |
-| HumanEval <br> 0-Shot |
-| MBPP <br> 3-Shot |
+| GSM-8K <br> 0-Shot; CoT | 83.6 | 86.4 | 90.8 | 61.1 | 46.4 | 59.8 | 77.4 | 64.7 | 78.1 |
+| MedQA <br> 2-Shot | 55.3 | 58.2 | 69.8 | 40.9 | 49.6 | 50.0 | 58.9 | 62.2 | 63.4 |
+| AGIEval <br> 0-Shot | 36.9 | 45.0 | 49.7 | 29.8 | 35.1 | 42.1 | 42.0 | 45.2 | 48.4 |
+| TriviaQA <br> 5-Shot | 57.1 | 59.1 | 73.3 | 45.2 | 72.3 | 75.2 | 73.6 | 82.2 | 85.8 |
+| Arc-C <br> 10-Shot | 84.0 | 90.7 | 91.9 | 75.9 | 78.6 | 78.3 | 80.5 | 87.3 | 87.4 |
+| Arc-E <br> 10-Shot | 95.2 | 97.1 | 98.0 | 88.5 | 90.6 | 91.4 | 92.3 | 95.6 | 96.3 |
+| PIQA <br> 5-Shot | 83.6 | 87.8 | 88.2 | 60.2 | 77.7 | 78.1 | 77.1 | 86.0 | 86.6 |
+| SociQA <br> 5-Shot | 76.1 | 79.0 | 79.4 | 68.3 | 74.6 | 65.5 | 73.2 | 75.9 | 68.3 |
+| BigBench-Hard <br> 0-Shot | 71.5 | 75.0 | 82.5 | 59.4 | 57.3 | 59.6 | 68.9 | 69.7 | 68.32 |
+| WinoGrande <br> 5-Shot | 72.5 | 82.5 | 81.2 | 54.7 | 54.2 | 55.6 | 58.0 | 62.0 | 68.8 |
+| OpenBookQA <br> 10-Shot | 80.6 | 88.4 | 86.6 | 73.6 | 79.8 | 78.6 | 81.6 | 85.8 | 86.0 |
+| BoolQ <br> 0-Shot | 78.7 | 82.9 | 86.5 | -- | 72.2 | 66.0 | 78.3 | 77.6 | 79.1 |
+| CommonSenseQA <br> 10-Shot | 78.0 | 80.3 | 82.6 | 69.3 | 72.6 | 76.2 | 73.6 | 78.1 | 79.6 |
+| TruthfulQA <br> 10-Shot | 63.2 | 68.1 | 74.8 | -- | 52.1 | 53.0 | 62.0 | 60.1 | 85.8 |
+| HumanEval <br> 0-Shot | 55.5 | 59.1 | 54.7 | 59.0 | 28.0 | 34.1 | 38.4 | 37.8 | 62.2 |
+| MBPP <br> 3-Shot | 62.5 | 71.4 | 73.7 | 60.6 | 50.8 | 51.5 | 65.3 | 60.2 | 77.8 |
 
 ## Software
 
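The removed rows above were truncated mid-cell, so the table no longer had a consistent column count. As a minimal sketch (Python, assuming a plain-text list of lines; `check_markdown_table` is a hypothetical helper, not part of this repository), one way to catch this kind of breakage before committing is to compare each row's cell count against the header row:

```python
# Hypothetical helper: flag Markdown table rows whose cell count differs from
# the header row -- the kind of truncation this commit fixes.
def check_markdown_table(lines):
    rows = [line for line in lines if line.strip().startswith("|")]
    if not rows:
        return []

    def cells(row):
        # Split on pipes after trimming the outer ones.
        return row.strip().strip("|").split("|")

    expected = len(cells(rows[0]))  # header row defines the expected width
    return [
        (i, row.strip())
        for i, row in enumerate(rows)
        if len(cells(row)) != expected
    ]

# Example: a truncated row like "| MMLU <br>5-Shot | 68." has fewer cells
# than the header, so it gets flagged.
table = [
    "| | Phi-3-Mini-128K-In | Phi-2 |",
    "|---|---|---|",
    "| MMLU <br>5-Shot | 68.",
]
print(check_markdown_table(table))  # -> [(2, '| MMLU <br>5-Shot | 68.')]
```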