NLPblue committed (verified)
Commit a92f778 · Parent: f991b90

Update README.md

Files changed (1):
  1. README.md +29 -3
README.md CHANGED
@@ -1,3 +1,29 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ ---
+
+ # Introduction
+
+ We present **Tongyi-DeepResearch**, an agentic large language model with 30 billion total parameters, of which only 3 billion are activated per token. Developed by Tongyi Lab, the model is purpose-built for **long-horizon, deep information-seeking** tasks. Tongyi-DeepResearch achieves state-of-the-art performance across a range of agentic search benchmarks, including BrowseComp-EN, BrowseComp-ZH, GAIA, Humanity's Last Exam, xbench-DeepSearch, and WebWalkerQA.
+
+
+ ## Key Features
+
+ - ⚙️ **Fully automated synthetic data generation pipeline**: Covers both the pre-training stage (data creation, filtering, and scaling) and the post-training stage (evaluation, refinement, and filtering).
+ - 🔄 **Large-scale continual pre-training on agentic data**: Leverages diverse, high-quality agentic interaction data to extend the model's capabilities, keep its knowledge fresh, and strengthen its reasoning performance.
+ - 🔁 **End-to-end reinforcement learning**: We employ a strictly on-policy RL approach built on a customized Group Relative Policy Optimization framework, with token-level policy gradients, leave-one-out advantage estimation, and selective filtering of negative samples to stabilize training in a non-stationary environment (a minimal sketch of the advantage estimator follows this list).
+ - 🤖 **Agent inference paradigm compatibility**: At inference time, Tongyi-DeepResearch supports two paradigms: ReAct, for rigorously evaluating the model's core intrinsic abilities, and an IterResearch-based "Heavy" mode, which applies a test-time scaling strategy to unlock the model's maximum performance (a generic ReAct loop is also sketched after this list).
+
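+ The leave-one-out advantage estimator named in the RL bullet is concrete enough to sketch. Below is a minimal, hypothetical PyTorch rendering of that idea: `leave_one_out_advantages`, `token_level_pg_loss`, and the `keep_negative_frac` knob are illustrative names of ours, and the exact negative-sample filtering criterion and GRPO customizations used in training are not specified here.
+
+ ```python
+ import torch
+
+ def leave_one_out_advantages(rewards: torch.Tensor) -> torch.Tensor:
+     """Each rollout's advantage is its reward minus the mean reward of
+     the *other* rollouts in the same group (a leave-one-out baseline)."""
+     g = rewards.numel()
+     baseline = (rewards.sum() - rewards) / (g - 1)  # mean of the others
+     return rewards - baseline
+
+ def token_level_pg_loss(logprobs, advantages, mask, keep_negative_frac=0.5):
+     """Token-level policy gradient with selective filtering of negatives.
+
+     logprobs:   (G, T) log-probs of sampled tokens under the current policy
+     advantages: (G,)   leave-one-out advantages, one per rollout
+     mask:       (G, T) 1.0 for response tokens, 0.0 for padding
+     keep_negative_frac: assumed knob, the fraction of negative-advantage
+         rollouts retained; the real selection criterion may differ.
+     """
+     keep = advantages >= 0                        # always keep positives
+     neg = (~keep).nonzero().squeeze(-1)
+     n_keep = int(keep_negative_frac * neg.numel())
+     if n_keep > 0:                                # retain a random subset of negatives
+         keep[neg[torch.randperm(neg.numel())[:n_keep]]] = True
+     per_token = -advantages[:, None] * logprobs * mask * keep[:, None].float()
+     return per_token.sum() / mask.sum().clamp(min=1)  # normalize over tokens
+ ```
+
+ In a real training loop this loss would sit inside a standard on-policy update step; it is shown only to make the leave-one-out baseline and the token-level normalization explicit.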
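+ The ReAct paradigm mentioned in the last bullet follows a fixed think / act / observe cycle. The loop below is a generic paraphrase of that pattern, not the repository's agent code; the `llm` and `tools` callables, the "Action: tool[input]" convention, and the step budget are all assumptions for illustration.
+
+ ```python
+ def react_loop(llm, tools: dict, question: str, max_steps: int = 20) -> str:
+     """Generic ReAct cycle: reason, call a tool, read the observation,
+     and repeat until the model emits a final answer."""
+     history = f"Question: {question}\n"
+     for _ in range(max_steps):
+         step = llm(history)           # one thought plus one action, or an answer
+         history += step + "\n"
+         if "Final Answer:" in step:
+             return step.split("Final Answer:", 1)[1].strip()
+         if "Action:" in step:         # assumed "Action: tool[input]" convention
+             call = step.split("Action:", 1)[1].strip()
+             name, _, arg = call.partition("[")
+             observation = tools[name.strip()](arg.rstrip("]"))
+             history += f"Observation: {observation}\n"
+     return "(no final answer within the step budget)"
+ ```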
+ ## Download
+
+ You can download the model and then run the inference scripts from https://github.com/Alibaba-NLP/DeepResearch.
+
+
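+ As a concrete starting point, the snippet below fetches the weights with `huggingface_hub`; the repo id is an assumption based on this model card, so confirm it against the Hub page or the GitHub README.
+
+ ```python
+ # Download the checkpoint locally; inference itself is driven by the
+ # scripts in https://github.com/Alibaba-NLP/DeepResearch.
+ from huggingface_hub import snapshot_download
+
+ local_dir = snapshot_download(
+     repo_id="Alibaba-NLP/Tongyi-DeepResearch-30B-A3B",  # assumed repo id
+ )
+ print(f"Model weights downloaded to: {local_dir}")
+ ```
+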
+ ## Citation
+
+ ```bibtex
+ @misc{tongyidr,
+   author={Tongyi DeepResearch Team},
+   title={Tongyi-DeepResearch},
+   year={2025},
+   howpublished={\url{https://github.com/Alibaba-NLP/DeepResearch}}
+ }
+ ```