LLaMA base weights (llama-13b-hf) + Vicuna delta weights (vicuna-13b-delta-v1.1) = Vicuna-13B
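The equation above can be sketched in code: the Vicuna delta checkpoint stores the difference between the fine-tuned weights and the original LLaMA weights, so merging is an element-wise addition per parameter. A minimal sketch with toy lists standing in for real tensors (the function name `apply_delta` and the toy values are illustrative, not FastChat's actual merge script):

```python
def apply_delta(base, delta):
    """Merge delta weights into base weights: merged[k] = base[k] + delta[k]."""
    assert base.keys() == delta.keys(), "checkpoints must share parameter names"
    return {name: [b + d for b, d in zip(base[name], delta[name])]
            for name in base}

# Toy example with two tiny "parameters" (real checkpoints hold large tensors):
llama = {"w": [1.0, 2.0], "b": [0.5]}
vicuna_delta = {"w": [0.1, -0.2], "b": [0.0]}
vicuna = apply_delta(llama, vicuna_delta)
```

The repository linked below already contains merged weights, so this step is only needed if you start from the original delta release.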

How to use

sudo apt install git git-lfs
git lfs install
pip3 install fschat
pip3 install git+https://github.com/huggingface/transformers
git clone https://huggingface.co/myaniu/Vicuna-13B
python3 -m fastchat.serve.cli --model-path /path/to/Vicuna-13B
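If you want to prompt the merged model programmatically rather than through the CLI, the input should follow Vicuna's conversation template. A minimal sketch, assuming the v1.1-style template (system text followed by USER/ASSISTANT turns); the exact system prompt and separators here are assumptions, and FastChat's `fastchat.conversation` module is the authoritative source:

```python
def build_vicuna_prompt(user_message: str,
                        system: str = ("A chat between a curious user and an "
                                       "artificial intelligence assistant.")) -> str:
    # Assumed v1.1-style layout: system prompt, then one USER turn,
    # ending with "ASSISTANT:" so the model continues from there.
    return f"{system} USER: {user_message} ASSISTANT:"

prompt = build_vicuna_prompt("What is the capital of France?")
```

The resulting string is what you would pass to the tokenizer before generation.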
