wandermay committed
Commit f2b0477 · verified · 1 Parent(s): 8210297

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -5,7 +5,7 @@ license: apache-2.0
 
 
 ### Model description
-NTele-R1-32B-V1 is the continuation of [NTele-R1-32B-Previce](https://huggingface.co/ZTE-AIM/NTele-R1-32B-Preview), you can visit for more information. We have made great improvements on the base by using less corpus **in mathematics and code (only 800 items, including 400 mathematics and 400 codes)**, and surpassed the industry's advanced models **Qwen3-32B and QwQ-32B**.
+NTele-R1-32B-V1 is the continuation of [NTele-R1-32B-Preview](https://huggingface.co/ZTE-AIM/NTele-R1-32B-Preview); visit it for more information. We made substantial improvements over the base using only a small corpus **of mathematics and code (800 items: 400 math and 400 code)** and surpassed the industry's advanced models **Qwen3-32B and QwQ-32B**.
 | Model | Trained From | Release Date | AIME2024 | AIME2025 | MATH500 | GPQA-Diamond | LCB(24.08-25.02) |
 |-------|-------|-------|-------|-------|-------|-------|-------|
 | DeepSeek-R1-Distill-Qwen-32B | Qwen2.5-32B-Instruct | 25.1.20 | 64.17 | 55.21 | 89.8 | 62.1 | 50.26 |
@@ -18,7 +18,7 @@ NTele-R1-32B-V1 is the continuation of [NTele-R1-32B-Previce](https://huggingfac
 
 [\[🤗 Codemath400\]](https://huggingface.co/datasets/ZTE-AIM/NTele-R1-Data)
 
-You can access our [dataset](https://huggingface.co/datasets/ZTE-AIM/NTele-R1-Data) to get 800 training data and visit the [NTele-R1-32B-Previce](https://huggingface.co/ZTE-AIM/NTele-R1-32B-Preview) to learn about the data synthesis and screening process.
+You can access our [dataset](https://huggingface.co/datasets/ZTE-AIM/NTele-R1-Data) to get the 800 training samples and visit [NTele-R1-32B-Preview](https://huggingface.co/ZTE-AIM/NTele-R1-32B-Preview) to learn about the data synthesis and screening process.
 
 
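For convenience, the Codemath400 dataset linked in the README can be pulled with the `datasets` library. This is a minimal sketch, assuming the ZTE-AIM/NTele-R1-Data repo exposes a default `train` split; check the dataset card for the actual split and column names.

```python
# Minimal sketch: fetch the ~800-sample Codemath400 set from the Hugging Face Hub.
# Assumption: the repo exposes a default "train" split; column names may differ.
from datasets import load_dataset

ds = load_dataset("ZTE-AIM/NTele-R1-Data", split="train")
print(len(ds))   # expected on the order of 800 items (400 math + 400 code)
print(ds[0])     # inspect one record to see the actual fields
```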
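Loading the model itself follows the standard `transformers` causal-LM pattern. This is a hedged sketch only: the `ZTE-AIM/NTele-R1-32B-V1` repo id below is inferred by analogy with the Preview link and is not stated in this commit.

```python
# Hedged sketch: load NTele-R1-32B-V1 as a standard causal LM.
# Assumption: the Hub repo id mirrors the Preview naming; adjust if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ZTE-AIM/NTele-R1-32B-V1"  # hypothetical id, not confirmed by this commit
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "What is 17 * 23? Show your reasoning."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```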