akhauriyash committed on
Commit d597d1e · verified · 1 parent: 5896a5c

End of training

Files changed (2):
1. README.md (+3, -1)
2. config.json (+1, -1)
README.md CHANGED
@@ -1,9 +1,11 @@
 ---
 base_model: akhauriyash/DeepSeek-R1-Distill-Qwen-1.5B-SpeculativeReasoner
+datasets: akhauriyash/OpenR1_Math_SpecR_GRPO
 library_name: transformers
 model_name: DeepSeek-R1-Distill-Qwen-1.5B-GRPO-SpeculativeReasoner
 tags:
 - generated_from_trainer
+- open-r1
 - trl
 - grpo
 licence: license
@@ -11,7 +13,7 @@ licence: license
 
 # Model Card for DeepSeek-R1-Distill-Qwen-1.5B-GRPO-SpeculativeReasoner
 
-This model is a fine-tuned version of [akhauriyash/DeepSeek-R1-Distill-Qwen-1.5B-SpeculativeReasoner](https://huggingface.co/akhauriyash/DeepSeek-R1-Distill-Qwen-1.5B-SpeculativeReasoner).
+This model is a fine-tuned version of [akhauriyash/DeepSeek-R1-Distill-Qwen-1.5B-SpeculativeReasoner](https://huggingface.co/akhauriyash/DeepSeek-R1-Distill-Qwen-1.5B-SpeculativeReasoner) on the [akhauriyash/OpenR1_Math_SpecR_GRPO](https://huggingface.co/datasets/akhauriyash/OpenR1_Math_SpecR_GRPO) dataset.
 It has been trained using [TRL](https://github.com/huggingface/trl).
 
 ## Quick start
config.json CHANGED
@@ -22,7 +22,7 @@
   "tie_word_embeddings": false,
   "torch_dtype": "bfloat16",
   "transformers_version": "4.50.0",
-  "use_cache": false,
+  "use_cache": true,
   "use_mrope": false,
   "use_sliding_window": false,
   "vocab_size": 151665
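The `use_cache` flip is the typical end-of-training cleanup: KV caching is usually disabled during training (it is incompatible with gradient checkpointing), then re-enabled for inference. A minimal sketch of applying the same change to a local `config.json` — the file contents below are copied from the diff hunk above, but the local path and write-back step are illustrative, not part of this commit:

```python
import json
from pathlib import Path

# Recreate the pre-commit config fragment locally (values from the diff above).
cfg_path = Path("config.json")
cfg_path.write_text(json.dumps({
    "tie_word_embeddings": False,
    "torch_dtype": "bfloat16",
    "transformers_version": "4.50.0",
    "use_cache": False,  # was disabled for GRPO training
    "use_mrope": False,
    "use_sliding_window": False,
    "vocab_size": 151665,
}, indent=2))

# Re-enable the KV cache for inference, mirroring this commit's one-line change.
cfg = json.loads(cfg_path.read_text())
cfg["use_cache"] = True
cfg_path.write_text(json.dumps(cfg, indent=2))

print(json.loads(cfg_path.read_text())["use_cache"])  # True
```

In practice the same effect can be had at load time without editing the file, e.g. passing `use_cache=True` when loading the model with `transformers`.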