---
license: apache-2.0
datasets:
- Norquinal/claude_multiround_chat_30k
- ehartford/dolphin
- BAAI/COIG-PC
- Open-Orca/OpenOrca
- vikp/textbook_quality_programming
---
# RWKV v4 world 7B 65k context
This model replaces the old RWKV 65k Claude model. It was fine-tuned with a special token and a lower learning rate to preserve the base model's former abilities, and trained on a large amount of high-quality English textbooks and Chinese novels at a 65k context length.
Running it with [RWKV Runner](https://github.com/josStorer/RWKV-Runner) requires only 16 GB of VRAM.
## Contributors
- [@KevinMr](https://huggingface.co/KevinMr)
- [@Remixa](https://huggingface.co/Remixa)
## Training details
https://wandb.ai/one-/one-rwkv-64k/runs/jn05hyc4
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/mpq2VrOaMZ_nXvV_yL-6o.png)
## Testcase
- https://rwkv-next-web.ai-creator.net/ (temporary)
- https://rwkv.ai-creator.net/risu
## How to use
Use the vocab files in the RWKV Runner configuration:
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/9V3J6uxaJESCC7WhIOD7p.png)
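Outside the GUI, the checkpoint can also be loaded programmatically. Below is a minimal sketch assuming the `rwkv` pip package (the ChatRWKV inference library); the model path is a placeholder for the downloaded `.pth` file, and the sampling settings are illustrative, not the card's recommendation.

```python
# Minimal sketch, assuming the `rwkv` pip package (pip install rwkv).
import os
os.environ["RWKV_JIT_ON"] = "1"   # enable JIT kernels before importing rwkv
os.environ["RWKV_CUDA_ON"] = "0"  # set to "1" to compile the CUDA kernel

from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# "cuda fp16" keeps the 7B weights in half precision on the GPU,
# which is what the ~16 GB VRAM figure above corresponds to.
# The path below is a placeholder for the downloaded checkpoint.
model = RWKV(model="path/to/rwkv-v4-world-7b-65k.pth", strategy="cuda fp16")

# World models use the rwkv_vocab_v20230424 vocabulary bundled with the package.
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")

args = PIPELINE_ARGS(temperature=1.0, top_p=0.7)
print(pipeline.generate("Question: What is RWKV?\n\nAnswer:", token_count=200, args=args))
```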