VoxCPM

This version of VoxCPM has been converted to run on the Axera NPU using w8a16 quantization. Compatible with Pulsar2 version 4.2.

Conversion tool links:

If you are interested in model conversion, you can try exporting the axmodel through the original repo: VoxCPM official

Pulsar2 Link: How to Convert an LLM from Hugging Face to axmodel

AXera NPU HOST LLM Runtime

Supported Platforms

How to use

Download all files from this repository to the device

1. Run the Python demo

1. Install the voxcpm axinfer package

git clone -b 1.0.4-axmode_infer https://github.com/techshoww/VoxCPM.git 
cd VoxCPM
pip3 install .

2. Download the ZipEnhancer model

pip3 install modelscope  
modelscope download --model iic/speech_zipenhancer_ans_multiloss_16k_base --local_dir iic/speech_zipenhancer_ans_multiloss_16k_base
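Before running the demo, you may want to confirm the download actually landed where expected. A minimal sketch of such a check, using only the standard library; the required file name is an assumption about what modelscope fetches, so adjust it to the actual directory contents:

```python
from pathlib import Path

def model_dir_ready(path: str, required_files=("configuration.json",)) -> bool:
    # Returns True when the directory exists and contains every expected file.
    # "configuration.json" is an assumed file name, not verified against
    # what modelscope downloads for this particular model.
    p = Path(path)
    return p.is_dir() and all((p / name).is_file() for name in required_files)

print(model_dir_ready("iic/speech_zipenhancer_ans_multiloss_16k_base"))
```

If this prints False, re-run the modelscope download command above before moving on.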

3. Run on Axera Device

Go to the root directory of this project, then run:

python3 run_ax650.py

2. Run the C++ demo

1. Install transformers

pip3 install "transformers>=4.56.2"

2. Start the tokenizer server

python3 tokenizer.py --port 9999
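The actual protocol is defined by tokenizer.py in this repo. The sketch below is only a hypothetical illustration of the idea, using Python's standard library and a stub tokenizer in place of the real Hugging Face one; the /encode route and the JSON fields are assumptions, not the repo's real API:

```python
import json
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def stub_encode(text: str) -> list:
    # Stand-in tokenizer for illustration only; the real server would
    # wrap a transformers AutoTokenizer instead of returning code points.
    return [ord(c) for c in text]

class TokenizerHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Hypothetical contract: POST /encode with {"text": "..."},
        # respond with {"ids": [...]}.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps({"ids": stub_encode(payload.get("text", ""))}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the demo quiet; remove to see request logs.
        pass

def serve(port: int = 9999) -> ThreadingHTTPServer:
    # Bind on localhost; call .serve_forever() on the returned server.
    return ThreadingHTTPServer(("127.0.0.1", port), TokenizerHandler)
```

A C++ client would then POST text to port 9999 and read token ids back, which matches the split this demo uses: tokenization stays in Python while inference runs in C++ on the NPU.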

3. Run the C++ demo

bash run_ax650.sh