---
license: apache-2.0
datasets:
- meta-math/MetaMathQA
language:
- en
pipeline_tag: text-generation
tags:
- Math
base_model:
- fblgit/una-cybertron-7b-v2-bf16
- meta-math/MetaMath-Mistral-7B
---
## MetaMath-Cybertron
A merge of [fblgit/una-cybertron-7b-v2-bf16](https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16) and [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B) using the SLERP merge method. A sketch of the operation follows below.
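For illustration, here is a minimal sketch of how a SLERP (spherical linear interpolation) merge combines two models' weights tensor by tensor. This is not the exact tooling used to produce this model (merges like this are typically done with a dedicated merge tool such as mergekit), and the interpolation factor `t = 0.5` is a placeholder assumption.

```python
# Minimal SLERP sketch: spherically interpolate between two weight tensors.
# Illustrative only; the actual merge was most likely produced with dedicated
# merge tooling, and t = 0.5 is a placeholder interpolation factor.
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between flattened tensors v0 and v1 at fraction t."""
    a = v0.flatten().float()
    b = v1.flatten().float()
    a_unit = a / (a.norm() + eps)
    b_unit = b / (b.norm() + eps)
    # Angle between the two weight vectors.
    omega = torch.acos(torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly colinear vectors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        merged = (torch.sin((1.0 - t) * omega) / so) * a + (torch.sin(t * omega) / so) * b
    return merged.reshape(v0.shape).to(v0.dtype)

# Applied layer by layer over two state dicts with identical keys:
# merged_sd = {k: slerp(0.5, sd_a[k], sd_b[k]) for k in sd_a}
```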
You can use the ChatML prompt format, for example as shown below.
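A minimal sketch of prompting the model with ChatML via `transformers`; the repo id below is a placeholder for this model's Hub id, and the system prompt and generation settings are assumptions you should adjust.

```python
# Minimal ChatML usage sketch. The repo id is a placeholder; replace it with
# this model's actual Hugging Face Hub id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/MetaMath-Cybertron"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# ChatML wraps each turn in <|im_start|> ... <|im_end|> markers.
prompt = (
    "<|im_start|>system\nYou are a helpful math assistant.<|im_end|>\n"
    "<|im_start|>user\nWhat is 12 * 17?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```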
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results: coming soon.
| Metric               | Value       |
|----------------------|-------------|
| Avg.                 | Coming soon |
| ARC (25-shot)        | Coming soon |
| HellaSwag (10-shot)  | Coming soon |
| MMLU (5-shot)        | Coming soon |
| TruthfulQA (0-shot)  | Coming soon |
| Winogrande (5-shot)  | Coming soon |
| GSM8K (5-shot)       | Coming soon |