🧠 Model Overview
- This project provides an English-Chinese translation model based on the RWKV-7 architecture, available in two sizes: approximately 0.4 billion and 1.5 billion parameters.
- Model weights were fine-tuned from the base model at https://huggingface.co/BlinkDL/rwkv7-g1
- The model has been fully fine-tuned on translation tasks and demonstrates strong performance across various domains, especially in handling long sentences, technical terminology, and culturally nuanced expressions.
- Unlike traditional Transformer-based models, RWKV combines the sequential state-passing mechanism of RNNs with the parallel training capability of Transformers. This design enables efficient inference while retaining strong sequence-modeling ability, making it well suited to deployment in resource-constrained environments such as mobile devices, embedded systems, and edge computing platforms.
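The state-passing idea can be illustrated with a toy recurrence (an illustrative sketch only, not the actual RWKV-7 update rule): each token updates a fixed-size state, so per-token inference cost and memory stay constant regardless of sequence length.

```python
# Toy linear recurrence illustrating RNN-style state passing.
# A fixed-size state is updated once per token, so per-token work
# is O(1) no matter how long the sequence is.
# NOTE: this is NOT the real RWKV-7 update rule, just the general idea.

def step(state: float, token_value: float, decay: float = 0.9) -> float:
    """Blend the incoming token into a decaying running state."""
    return decay * state + (1.0 - decay) * token_value

def run_sequence(tokens: list[float]) -> float:
    state = 0.0
    for t in tokens:
        state = step(state, t)  # constant work and memory per token
    return state

print(run_sequence([1.0, 1.0, 1.0]))
```

A Transformer, by contrast, attends over all previous tokens at each step; the recurrent formulation is what keeps memory flat on edge hardware.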
📦 Install Dependencies
🟢 For Nvidia CUDA

```shell
pip install torch rwkv gradio python-docx PyPDF2 chardet
```
🔴 For AMD ROCm

- Disable the custom CUDA kernel before importing `rwkv`:

```python
os.environ["RWKV_CUDA_ON"] = '0'
```

- Install the ROCm build of PyTorch, then the remaining dependencies:

```shell
pip install torch --index-url https://download.pytorch.org/whl/rocm6.3
pip install rwkv python-docx PyPDF2 chardet
```
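After installing, you can quickly confirm that the packages resolved (a small convenience check; note that the import name for python-docx is `docx`, which differs from its pip name):

```python
from importlib.util import find_spec

def missing_packages(import_names: list[str]) -> list[str]:
    """Return the import names that cannot be resolved in this environment."""
    return [name for name in import_names if find_spec(name) is None]

# Import names, not pip names: python-docx installs as "docx".
required = ["torch", "rwkv", "gradio", "docx", "PyPDF2", "chardet"]
print(missing_packages(required) or "all dependencies found")
```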
🚀 Run the Demo

- Change line 20 in `webui_new.py` to your own model weights path, then run:

```shell
python webui_new.py
```
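Before launching, it can help to verify that the weights path you set in `webui_new.py` actually points at a file (a hypothetical helper; the variable name inside the script may differ):

```python
import os

def check_weights(path: str) -> str:
    """Basic sanity check on a model weights path before launching the demo."""
    if not os.path.isfile(path):
        return f"not found: {path}"
    size_gb = os.path.getsize(path) / 1e9
    return f"ok: {path} ({size_gb:.2f} GB)"

# Hypothetical example path; replace with the path you set in webui_new.py.
print(check_weights("/models/RWKV_v7_G1_Translate_ctx4096_20250620.pth"))
```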
⚠️ Notice

- Earlier releases supported English → Chinese translation only.
- The model now supports both English → Chinese and Chinese → English translation.
💡 Key Advantages
- ✅ Lightweight and Deployment-Friendly: Achieves high-quality translation with only 0.4B / 1.5B parameters.
- ✅ Strong Long-Context Modeling: Supports input lengths up to 4096 tokens.
- ✅ Low Memory Footprint: Ideal for edge devices, mobile apps, and embedded systems.
- ✅ Multilingual Potential: Built upon a multilingual pre-training foundation; future versions may support more language pairs.
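For inputs that might exceed the 4096-token context window, one simple approach is to split the text into chunks and translate each chunk separately. Below is a naive whitespace-based sketch; a real tokenizer counts tokens differently than words, so the budget is kept well under the window as a safety margin:

```python
def chunk_text(text: str, max_words: int = 3000) -> list[str]:
    """Split text on whitespace into chunks of at most max_words words.

    Whitespace word counts only approximate real token counts, so
    max_words is kept well below the 4096-token context window.
    """
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

print(chunk_text("one two three four five", max_words=2))
# → ['one two', 'three four', 'five']
```

Splitting on sentence or paragraph boundaries instead of raw word counts would preserve more context for translation; this sketch only shows the budgeting idea.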
📚 Recommended Resources
- 📘 Official RWKV Repo
- 🧪 Official RWKV Website
- 🧰 Official RWKV project collection
- 📦 Official fine-tuning Repo
- 🤖 RWKV Runner
- 🌐 AI00 Web Server
🧩 Developer Info
- Developer: Alic Li
- GitHub: https://github.com/Alic-Li
- Contact: [email protected]
Model tree for Alic-Li/RWKV_v7_G1_Translate_ctx4096_20250620
- Base model: BlinkDL/rwkv7-g1