WenetSpeech-Yue
A Large-scale Cantonese Speech Corpus with Multi-dimensional Annotation
WenetSpeech-Yue: Demos; Paper; GitHub; HuggingFace
WenetSpeech-Yue TTS Models have been released!
This repository contains two versions of the TTS models: Cosyvoice2-Yue (the base model) and Cosyvoice2-Yue-ZoengJyutGaai (a variant of the base model); both checkpoints are fetched in the Model download section below.
Clone and install
git clone https://github.com/ASLP-lab/WenetSpeech-Yue.git
cd WenetSpeech-Yue/CosyVoice2-Yue
conda create -n cosyvoice python=3.10
conda activate cosyvoice
# pynini is required by WeTextProcessing; use conda to install it, as the conda package works on all platforms.
conda install -y -c conda-forge pynini==2.1.5
# The Aliyun PyPI mirror is optional; drop the -i/--trusted-host flags to use the default index.
pip install -r requirements.txt -i https://mirrors.aliyun.com/pypi/simple/ --trusted-host=mirrors.aliyun.com
Model download
from huggingface_hub import snapshot_download
snapshot_download('ASLP-lab/Cosyvoice2-Yue', local_dir='pretrained_models/Cosyvoice2-Yue')
snapshot_download('ASLP-lab/Cosyvoice2-Yue-ZoengJyutGaai', local_dir='pretrained_models/Cosyvoice2-Yue-ZoengJyutGaai')
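snapshot_download returns the path of the local directory it populated, so a quick existence check can catch an interrupted download before model loading fails later. A minimal sketch (the directory names simply mirror the local_dir arguments above):

import os

# Sanity-check that both checkpoints were downloaded and are non-empty.
for d in ('pretrained_models/Cosyvoice2-Yue',
          'pretrained_models/Cosyvoice2-Yue-ZoengJyutGaai'):
    assert os.path.isdir(d) and os.listdir(d), f'checkpoint missing or empty: {d}'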
Usage
import sys
sys.path.append('third_party/Matcha-TTS')
from cosyvoice.cli.cosyvoice import CosyVoice, CosyVoice2
from cosyvoice.utils.file_utils import load_wav
import torchaudio
import opencc
# OpenCC Simplified-to-Traditional converter; the demo text below is written in Simplified characters.
converter = opencc.OpenCC('s2t.json')
# Base Cantonese TTS model
cosyvoice_base = CosyVoice2(
'pretrained_models/Cosyvoice2-Yue',
load_jit=False, load_trt=False, load_vllm=False, fp16=False
)
# ZoengJyutGaai variant
cosyvoice_zjg = CosyVoice2(
'pretrained_models/Cosyvoice2-Yue-ZoengJyutGaai',
load_jit=False, load_trt=False, load_vllm=False, fp16=False
)
prompt_speech_16k = load_wav('asset/sg_017_090.wav', 16000)
# Demo sentence (Simplified Chinese, Cantonese wording): "Receiving a birthday gift sent
# by a friend from afar; the unexpected surprise and heartfelt blessing filled my heart
# with sweet joy, and my smile bloomed like a flower."
text = '收到朋友从远方寄嚟嘅生日礼物,嗰份意外嘅惊喜同埋深深嘅祝福令我心入面充满咗甜蜜嘅快乐,笑容好似花咁绽放。'
text = converter.convert(text)
# The instruction prompt '用粤语说这句话' means "Say this sentence in Cantonese".
for i, j in enumerate(cosyvoice_base.inference_instruct2(text, '用粤语说这句话', prompt_speech_16k, stream=False)):
    torchaudio.save('base_{}.wav'.format(i), j['tts_speech'], cosyvoice_base.sample_rate)
for i, j in enumerate(cosyvoice_zjg.inference_instruct2(text, '用粤语说这句话', prompt_speech_16k, stream=False)):
    torchaudio.save('zjg_{}.wav'.format(i), j['tts_speech'], cosyvoice_zjg.sample_rate)
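inference_instruct2 can yield more than one segment for long inputs, and the loops above write each segment to its own file. If a single file is preferred, the segments can be concatenated along the time axis. A minimal sketch, assuming each tts_speech tensor has shape (1, num_samples) as in the upstream CosyVoice code:

import torch

# Generate once, collect every segment, and write one continuous waveform.
segments = [j['tts_speech'] for j in cosyvoice_base.inference_instruct2(
    text, '用粤语说这句话', prompt_speech_16k, stream=False)]
torchaudio.save('base_full.wav', torch.cat(segments, dim=1), cosyvoice_base.sample_rate)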
If you would like to get in touch with our research team, feel free to email [email protected] or [email protected].