---
license: apache-2.0
---

# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_SummerSigh__GPTNeo350M-Instruct-SFT).

| Metric              | Value |
|---------------------|------:|
| Avg.                | 27.03 |
| ARC (25-shot)       | 25.94 |
| HellaSwag (10-shot) | 38.55 |
| MMLU (5-shot)       | 25.76 |
| TruthfulQA (0-shot) | 45.25 |
| Winogrande (5-shot) | 50.2  |
| GSM8K (5-shot)      | 0.3   |
| DROP (3-shot)       | 3.24  |