Update README.md

README.md:

```diff
@@ -38,5 +38,5 @@ note: the new `GGUF AudioEncoder Loader` on test; running gguf audio encoder `wa
 
 ## **reference**
 - for the lite workflow (save >70% loading time), get the `lite lora` for 4/8-step operation [here](https://huggingface.co/calcuis/wan2-gguf/blob/main/wan2.2_s2v_lite_lora.safetensors)
-- or opt to use scaled fp8 e4m3 safetensors
+- or opt to use scaled fp8 e4m3 safetensors `audio encoder` [here](https://huggingface.co/chatpig/encoder/blob/main/wav2vec2_large_english_fp8_e4m3fn.safetensors) and/or fp8 e4m3 `vae` [here](https://huggingface.co/calcuis/wan2-gguf/blob/main/wan_2.1_vae_fp8_e4m3fn.safetensors) (don't even need to switch to native loaders as `GGUF AudioEncoder Loader` and `GGUF VAE Loader` support both gguf and fp8 scaled safetensors files)
 - gguf-node ([pypi](https://pypi.org/project/gguf-node)|[repo](https://github.com/calcuis/gguf)|[pack](https://github.com/calcuis/gguf/releases))
```
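A loader can accept both formats because each is trivially identifiable from its first bytes: GGUF files open with the 4-byte magic `GGUF`, while safetensors files open with an 8-byte little-endian length followed by a JSON header that starts with `{`. A minimal sketch of such dispatch (this is an illustration, not the actual gguf-node loader code; the function name is hypothetical):

```python
def detect_weight_format(path):
    """Guess whether a weights file is gguf or safetensors by magic bytes.

    Hypothetical helper for illustration only. GGUF starts with b"GGUF";
    safetensors starts with a u64 little-endian JSON-header length, and
    that JSON header itself begins with "{".
    """
    with open(path, "rb") as f:
        head = f.read(9)
    if head[:4] == b"GGUF":
        return "gguf"
    if len(head) == 9 and head[8:9] == b"{":
        return "safetensors"
    return "unknown"
```

With a check like this up front, a single loader node can route a file to either the gguf parser or the safetensors parser without the user switching nodes.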