Full Wan2.1 14B VACE model, converted from fp16 to fp8_scaled, using this script.

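For reference, the sketch below illustrates the general idea behind per-tensor fp8 (e4m3) "scaled" quantization: compute a scale from each weight tensor's absolute maximum, cast the scaled weights to `torch.float8_e4m3fn`, and store the scale alongside so the weights can be dequantized at load time. The key names, file paths, and the choice of which tensors to quantize are assumptions for illustration and may not match the actual conversion script linked above.

```python
# Minimal sketch of per-tensor fp8 (e4m3) scaled quantization.
# Assumes a safetensors fp16 checkpoint; "scale_weight" key suffix and the
# input/output filenames are hypothetical.
import torch
from safetensors.torch import load_file, save_file

FP8_MAX = torch.finfo(torch.float8_e4m3fn).max  # ~448 for e4m3


def quantize_fp8_scaled(state_dict):
    out = {}
    for name, w in state_dict.items():
        if w.dtype == torch.float16 and w.ndim == 2:  # quantize linear weights only
            scale = w.abs().max().float() / FP8_MAX   # per-tensor scale factor
            out[name] = (w.float() / scale).clamp(-FP8_MAX, FP8_MAX).to(torch.float8_e4m3fn)
            out[name + ".scale_weight"] = scale.reshape(1)  # stored for dequantization
        else:
            out[name] = w  # keep norms, biases, embeddings, etc. in original precision
    return out


if __name__ == "__main__":
    sd = load_file("wan2.1_vace_14b_fp16.safetensors")
    save_file(quantize_fp8_scaled(sd), "wan2.1_vace_14b_fp8_scaled.safetensors")
```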