Update README.md
# Introduction
We present **Tongyi DeepResearch**, an agentic large language model with 30 billion total parameters, of which only 3 billion are activated per token. Developed by Tongyi Lab, the model is purpose-built for **long-horizon, deep information-seeking** tasks. Tongyi DeepResearch achieves state-of-the-art performance across a range of agentic search benchmarks, including Humanity's Last Exam, BrowseComp, BrowseComp-ZH, WebWalkerQA, xbench-DeepSearch, FRAMES, and SimpleQA.
More details can be found in our 📰 [Tech Blog](https://tongyi-agent.github.io/blog/introducing-tongyi-deep-research).