---
base_model:
- deepseek-ai/DeepSeek-R1
library_name: transformers
tags:
- reasoning
- R1
- 1M
- fast
- Deca
- Deca-AI
- Deca-2
- Qwen
license: other
---
> [!NOTE]
> # **Deca 2 is now generally available. We recommend that you do not use this model and instead use [`deca-ai/2-mini`](https://huggingface.co/deca-ai/2-mini/).**


![Deca 2 Banner](https://huggingface.co/deca-ai/2-mini-beta/resolve/main/banner.jpg)
The Deca 2 family of models, [no longer in beta](https://huggingface.co/deca-ai/2-mini/), is built on cutting-edge architectures such as DeepSeek R1 and Qwen 2, delivering extraordinary performance. With a focus on speed and efficiency, Deca 2 is revolutionizing text generation and setting new standards in the industry. It also comes with a **1 million**-token context window.

As more capabilities are added, Deca 2 will evolve into a more powerful, any-to-any model. While it is focused on text generation for now, its foundation is designed to scale, with even more advanced functionality to come.

* **2/14 Release:**
  * Enhanced Instruction Following