---
base_model:
- deepseek-ai/DeepSeek-R1
library_name: transformers
tags:
- reasoning
- R1
- 1M
- fast
- Deca
- Deca-AI
- Deca-2
- Qwen
license: other
---
|
> [!NOTE]
> # **Deca 2 is now generally available. We recommend using [`deca-ai/2-mini`](https://huggingface.co/deca-ai/2-mini/) instead of this model.**
|
|
The Deca 2 family of models, [no longer in beta](https://huggingface.co/deca-ai/2-mini/), is built on cutting-edge architectures such as DeepSeek R1 and Qwen 2, delivering exceptional performance. With a focus on speed and efficiency, Deca 2 aims to set new standards for text generation. It also ships with a **1 million**-token context window.
|
|
|
As more capabilities are added, Deca 2 will evolve into a more powerful, any-to-any model. While it is focused on text generation for now, its foundation is designed to scale, with more advanced functionality to come.
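Since the card lists `transformers` as the library, loading a Deca 2 checkpoint should follow the standard causal-LM pattern. A minimal sketch, assuming the repo id `deca-ai/2-mini` from the note above (generation parameters here are illustrative, not official recommendations):

```python
# Hypothetical usage sketch for a Deca 2 checkpoint via the Transformers library.
# "deca-ai/2-mini" is the repo id recommended in the note above; swap in the
# exact checkpoint you intend to use.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "deca-ai/2-mini"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and return the generated continuation of `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # spread layers across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Drop the prompt tokens and decode only the newly generated text.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that a full 1M-token context requires substantial GPU memory; for long-context work, check the checkpoint's config for the supported `max_position_embeddings` before assuming the full window fits on your hardware.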
|
|
|
* **2/14 Release:**
  * Enhanced instruction following