nielsr (HF Staff) committed
Commit 01315bf · verified · 1 parent: d261c4a

Improve model card: add project page link


This PR adds a link to the project page and ensures consistent formatting across links and emojis in the model card.

Files changed (1): README.md (+10 −9)
README.md CHANGED
@@ -3,23 +3,23 @@ base_model:
 - Qwen/Qwen2.5-3B-Instruct
 datasets:
 - ulab-ai/Time-Bench
+library_name: transformers
 license: apache-2.0
+pipeline_tag: text-generation
 tags:
 - temporal-reasoning
 - reinforcement-learning
 - large-language-models
 paperswithcode:
   arxiv_id: 2505.13508
-library_name: transformers
-pipeline_tag: text-generation
 ---
 
 <center>
-    <img src="https://cdn-uploads.huggingface.co/production/uploads/65d188a4aa309d842e438ef1/d6YiWBndm7WzANfl3e1qi.png" alt="Output Examples" width="600">
+<img src="https://cdn-uploads.huggingface.co/production/uploads/65d188a4aa309d842e438ef1/d6YiWBndm7WzANfl3e1qi.png" alt="Output Examples" width="600">
 </center>
 
 <div align="center">
-<a href="https://huggingface.co/datasets/ulab-ai/Time-Bench"> πŸ“Š <strong>Dataset</strong></a> | <a href="https://github.com/ulab-uiuc/Time-R1">πŸš€ <strong>Code</strong></a> | <a href="https://arxiv.org/abs/2505.13508">πŸ“– <strong>Paper</strong></a>
+<a href="https://huggingface.co/datasets/ulab-ai/Time-Bench"> πŸ“Š <strong>Dataset</strong></a> | <a href="https://github.com/ulab-uiuc/Time-R1">πŸš€ <strong>Code</strong></a> | <a href="https://arxiv.org/abs/2505.13508">πŸ“– <strong>Paper</strong></a> | <a href="https://sites.google.com/view/eagle-llm">🌐 <strong>Project Page</strong></a>
 </div>
 
 # Time-R1 Model Series
@@ -74,8 +74,9 @@ model = AutoModelForCausalLM.from_pretrained(model_name)
 ## Citations
 ```bibtex
 @article{liu2025time,
-  title={Time-R1: Towards Comprehensive Temporal Reasoning in LLMs},
-  author={Liu, Zijia and Han, Peixuan and Yu, Haofei and Li, Haoru and You, Jiaxuan},
-  journal={arXiv preprint arXiv:2505.13508},
-  year={2025}
-}
+title={Time-R1: Towards Comprehensive Temporal Reasoning in LLMs},
+author={Liu, Zijia and Han, Peixuan and Yu, Haofei and Li, Haoru and You, Jiaxuan},
+journal={arXiv preprint arXiv:2505.13508},
+year={2025}
+}
+```
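The `library_name: transformers` and `pipeline_tag: text-generation` fields added in this diff are what let the Hub associate the model with the transformers library and the text-generation task. A minimal sketch of what that enables, assuming a hypothetical repo id (substitute the actual Time-R1 checkpoint name on the Hub):

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the actual Time-R1 checkpoint.
model_id = "ulab-ai/Time-R1"

# `pipeline_tag: text-generation` in the card metadata corresponds to this task.
generator = pipeline("text-generation", model=model_id)

prompt = "In which year was the Transformer architecture introduced?"
print(generator(prompt, max_new_tokens=64)[0]["generated_text"])
```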