---
license: mit
library_name: transformers
pipeline_tag: text-generation
---

This model was trained using the recipe from the paper [M1: Towards Scalable Test-Time Compute with Mamba Reasoning Models](https://arxiv.org/abs/2504.10449).

Benchmark results (accuracy, %):

| **Model** | **AIME 2025** | **AIME 2024** | **MATH 500** | **AMC 2023** | **OlympiadBench** |
|-----------------------------------|---------------|---------------|--------------|--------------|-------------------|
| Qwen2.5-Math-7B-Instruct (Transformer) | – | 13.3 | 79.8 | 50.6 | 40.7 |
| rStar-Math-7B (Transformer) | – | 26.7 | 78.4 | 47.5 | 47.1 |
| Eurus-2-7B-PRIME (Transformer) | – | 26.7 | 79.2 | 57.8 | 42.1 |
| Qwen2.5-7B-SimpleRL (Transformer) | – | 26.7 | 82.4 | 62.5 | 43.3 |
| DeepSeek-R1-Distill-Qwen-1.5B (Transformer) | 23.0 | 28.8 | 82.8 | 62.9 | 43.3 |
| **M1-3B (Mamba Hybrid Model)** | 23.5 | 28.5 | 84.0 | 62.8 | 47.3 |

Code: https://github.com/jxiw/M1
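
Below is a minimal usage sketch with the Transformers library. The repository id `jxiw/M1-3B` is a placeholder assumption (substitute the id of this model card), and `trust_remote_code=True` is assumed in case the hybrid Mamba architecture ships custom modeling code; see the GitHub repository above for the authors' exact inference setup.

```
# Minimal sketch, not the authors' official example.
# Assumptions: the repo id below is a placeholder for this model card's id,
# and the checkpoint loads through AutoModelForCausalLM (trust_remote_code=True
# in case the hybrid Mamba architecture ships custom modeling code).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jxiw/M1-3B"  # placeholder assumption

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

prompt = "What is the sum of the first 100 positive integers? Think step by step."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The `torch.bfloat16` and `device_map="auto"` settings are just convenient single-GPU defaults; a larger `max_new_tokens` budget is the test-time compute axis the paper scales.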

```
@article{wang2025m1scalabletesttimecompute,
  title={M1: Towards Scalable Test-Time Compute with Mamba Reasoning Models},
  author={Junxiong Wang and Wen-Ding Li and Daniele Paliotta and Daniel Ritter and Alexander M. Rush and Tri Dao},
  journal={arXiv preprint arXiv:2504.10449},
  year={2025},
  url={https://arxiv.org/abs/2504.10449},
}
```