DeepSeek-R1-Distill-Qwen-14B — RKLLM build for RK3588 boards
Built with DeepSeek (DeepSeek License Agreement)
Author: @jamescallander
Source model: deepseek-ai/DeepSeek-R1-Distill-Qwen-14B · Hugging Face
Target: Rockchip RK3588 NPU via RKNN-LLM Runtime
This repository hosts a conversion of DeepSeek-R1-Distill-Qwen-14B for use on Rockchip RK3588 single-board computers (Orange Pi 5 Plus, Radxa ROCK 5B+, Banana Pi M7, etc.). Conversion was performed using the RKNN-LLM toolkit.
Conversion details
- RKLLM-Toolkit version: v1.2.1
- NPU driver: v0.9.8
- Python: 3.12
- Quantization: w8a8_g128
- Output: single-file .rkllm artifact
- Tokenizer: not required at runtime (UI handles prompt I/O)
Intended use
- On-device inference on RK3588 SBCs.
- Reasoning-focused model — designed to handle multi-step thinking, problem-solving, and structured explanations.
- Well-suited for tasks that need step-by-step reasoning or more careful breakdowns than typical instruction models.
Limitations
- Requires 16 GB of free memory.
- Quantized build (w8a8_g128) may show small quality differences vs. the full-precision upstream model.
- Tested on a Radxa ROCK 5B+; other devices may require different driver/toolkit versions.
- While strong at reasoning, performance is limited by RK3588’s NPU compared to high-end GPUs.
Quick start (RK3588)
1) Install runtime
The RKNN-LLM toolkit and installation instructions can be found on the specific development board manufacturer's website or on airockchip's GitHub page.
Download and install the required packages as per the toolkit's instructions.
2) Simple Flask server deployment
The simplest way to deploy the converted .rkllm model is to use the example script provided in the toolkit under rknn-llm/examples/rkllm_server_demo:
python3 <TOOLKIT_PATH>/rknn-llm/examples/rkllm_server_demo/flask_server.py \
--rkllm_model_path <MODEL_PATH>/DeepSeek-R1-Distill-Qwen-14B_w8a8_g128_rk3588.rkllm \
--target_platform rk3588
3) Sending a request
The basic format for a message request is:
{
  "model": "DeepSeek-R1-Distill-Qwen-14B",
  "messages": [{
    "role": "user",
    "content": "<YOUR_PROMPT_HERE>"
  }],
  "stream": false
}
Example request using curl:
curl -s -X POST <SERVER_IP_ADDRESS>:8080/rkllm_chat \
-H 'Content-Type: application/json' \
-d '{"model":"DeepSeek-R1-Distill-Qwen-14B","messages":[{"role":"user","content":"In 2 or 3 sentences, who was Napoleon Bonaparte?"}],"stream":false}'
The response is formatted in the following way:
{
  "choices": [{
    "finish_reason": "stop",
    "index": 0,
    "logprobs": null,
    "message": {
      "content": "<MODEL_REPLY_HERE>",
      "role": "assistant"
    }
  }],
  "created": null,
  "id": "rkllm_chat",
  "object": "rkllm_chat",
  "usage": {
    "completion_tokens": null,
    "prompt_tokens": null,
    "total_tokens": null
  }
}
Example response:
{"choices":[{"finish_reason":"stop","index":0,"logprobs":null,"message":{"content":"Okay, so I need to figure out who Napoleon Bonaparte was in a couple of sentences. Let me start by recalling what I know about him. He was a significant historical figure, probably from France. I think he was involved in the French Revolution somehow. Maybe he became a leader or something after the revolution. Wait, wasn't he a military leader? Yeah, I remember that he led some successful campaigns and became Emperor of France. His full name might have been Napoleon I, part of the Bonaparte dynasty. He had a big impact on Europe during his time, maybe in the early 19th century. I also think he was exiled at least once, like on Elba, and then tried to come back but got defeated again. There's something about the Battle of Waterloo where he lost. His influence must have been both positive and negative—maybe he brought some reforms but also caused a lot of wars across Europe. So putting it all together: Napoleon Bonaparte was a French military leader who rose to power after the revolution, became Emperor, led many battles, had significant political and cultural impacts, and was eventually defeated. He's remembered for his military genius and the Napoleonic Code, but also for causing widespread conflict. </think> Napoleon Bonaparte was a prominent French military leader and political figure who rose to power during the French Revolution. He became Emperor of France, leading numerous successful campaigns across Europe and implementing significant reforms, including the Napoleonic Code. Despite his influence and achievements, he was eventually defeated in the Battle of Waterloo, marking the end of his rule and his exile.","role":"assistant"}}],"created":null,"id":"rkllm_chat","object":"rkllm_chat","usage":{"completion_tokens":null,"prompt_tokens":null,"total_tokens":null}}
Note on reasoning traces
This model emits intermediate reasoning text (a chain of thought) before its final answer, terminated by a </think> marker, as in the example response above.
- Many OpenAI-compatible UIs automatically suppress or hide this internal reasoning.
- If your client does not, you may see the reasoning steps along with the final answer.
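If your client shows the raw output, you can strip the trace yourself. A minimal, self-contained sketch (assuming the trace ends at a </think> marker, as in the example response above; the sample string is illustrative only):
def strip_reasoning(content: str) -> str:
    # Keep only the text after the closing </think> marker, if one is present
    return content.split("</think>", 1)[-1].strip()

# Example: applied to the "content" field of choices[0]["message"] from the response
print(strip_reasoning("Okay, so I need to figure out ... </think> Napoleon Bonaparte was a prominent French military leader ..."))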
4) UI compatibility
This server exposes an OpenAI-compatible Chat Completions API.
You can connect it to any OpenAI-compatible client or UI (for example, Open WebUI).
- Configure your client with the API base http://<SERVER_IP_ADDRESS>:8080 and use the endpoint /rkllm_chat.
- Make sure the model field matches the converted model's name, for example:
{
"model": "DeepSeek-R1-Distill-Qwen-14B",
"messages": [{"role":"user","content":"Hello!"}],
"stream": false
}
License
This conversion is released under the MIT License.
- Attribution: Built with DeepSeek-R1-Distill-Qwen-14B (DeepSeek-AI)
- Required notice: see NOTICE
- Modifications: quantization (w8a8_g128), export to .rkllm format for RK3588 SBCs