250928 low noise: LoRA key not loaded issue
lora key not loaded: blocks.0.cross_attn.k.alpha
lora key not loaded: blocks.0.cross_attn.k.lora_down.weight
lora key not loaded: blocks.0.cross_attn.k.lora_up.weight
lora key not loaded: blocks.0.cross_attn.o.alpha
lora key not loaded: blocks.0.cross_attn.o.lora_down.weight
lora key not loaded: blocks.0.cross_attn.o.lora_up.weight
lora key not loaded: blocks.0.cross_attn.q.alpha
lora key not loaded: blocks.0.cross_attn.q.lora_down.weight
lora key not loaded: blocks.0.cross_attn.q.lora_up.weight
Yep, same errors here.
Hi, could you please give us more information?
Are you using ComfyUI or our codebase? If you are using ComfyUI, could you please share your workflow?
Same issue here, using ComfyUI 0.3.60.
lora key not loaded: blocks.0.cross_attn.k.alpha
lora key not loaded: blocks.0.cross_attn.k.lora_down.weight
lora key not loaded: blocks.0.cross_attn.k.lora_up.weight
lora key not loaded: blocks.0.cross_attn.o.alpha
lora key not loaded: blocks.0.cross_attn.o.lora_down.weight
lora key not loaded: blocks.0.cross_attn.o.lora_up.weight
lora key not loaded: blocks.0.cross_attn.q.alpha
lora key not loaded: blocks.0.cross_attn.q.lora_down.weight
lora key not loaded: blocks.0.cross_attn.q.lora_up.weight
Hi, could you please give us more information?
Are you using ComfyUI or our codebase? If you are using ComfyUI, could you please share your workflow?
Using the official ComfyUI low-noise workflow together with the new low-noise LoRA triggers this error. The workflow is just your own example workflow, which is enough to reproduce it: https://huggingface.co/lightx2v/Wan2.2-Lightning/blob/main/Wan2.2-T2V-A14B-4steps-lora-rank64-Seko-V1.1/Wan2.2-T2V-A14B-4steps-lora-rank64-Seko-V1.1-NativeComfy.json
When using GGUF and FP8_scaled from Comfy-org, the following warning appears:
Using scaled fp8: fp8 matrix mult: True, scale input: True
model weight dtype torch.float16, manual cast: None
model_type FLOW
lora key not loaded: blocks.0.cross_attn.k.alpha
lora key not loaded: blocks.0.cross_attn.k.lora_down.weight
lora key not loaded: blocks.0.cross_attn.k.lora_up.weight
lora key not loaded: blocks.0.cross_attn.o.alpha
lora key not loaded: blocks.0.cross_attn.o.lora_down.weight
lora key not loaded: blocks.0.cross_attn.o.lora_up.weight
lora key not loaded: blocks.0.cross_attn.q.alpha
lora key not loaded: blocks.0.cross_attn.q.lora_down.weight
lora key not loaded: blocks.0.cross_attn.q.lora_up.weight
lora key not loaded: blocks.0.cross_attn.v.alpha
lora key not loaded: blocks.0.cross_attn.v.lora_down.weight
lora key not loaded: blocks.0.cross_attn.v.lora_up.weight
lora key not loaded: blocks.0.ffn.0.alpha
lora key not loaded: blocks.0.ffn.0.lora_down.weight
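These warnings mean the loader found keys in the LoRA file whose names it could not match to any weight in the loaded model, so those adapter weights are silently skipped. A quick way to diagnose this is to list the key prefixes in the LoRA file and compare them with what the loader reports. A minimal sketch (the grouping logic is generic; the sample keys below just mirror the warnings in this thread):

```python
# Sketch: group LoRA keys by their first path component to spot a
# naming-scheme mismatch (e.g. "blocks" vs. "diffusion_model").
from collections import Counter

def summarize_lora_keys(keys):
    """Count keys by their first dotted path component."""
    return Counter(k.split(".", 1)[0] for k in keys)

# Sample keys copied from the "not loaded" warnings above.
sample = [
    "blocks.0.cross_attn.q.alpha",
    "blocks.0.cross_attn.q.lora_down.weight",
    "blocks.0.cross_attn.q.lora_up.weight",
]
print(summarize_lora_keys(sample))  # Counter({'blocks': 3})

# For a real file you could read the keys with safetensors, e.g.:
#   from safetensors import safe_open
#   with safe_open("lora.safetensors", framework="pt") as f:
#       keys = list(f.keys())
# ("lora.safetensors" is a placeholder path.)
```

If every unmatched key shares one prefix, the problem is almost always the naming convention of the checkpoint rather than missing weights.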
Final result:
Same issue here. Why is this happening?
Try Kijai's version; those seem to be working:
https://huggingface.co/Kijai/WanVideo_comfy/tree/main/LoRAs/Wan22-Lightning
This seems to work! But the output is very saturated; is there a way to fix it?
lora key not loaded: blocks.0.cross_attn.k.alpha
lora key not loaded: blocks.0.cross_attn.k.lora_down.weight
lora key not loaded: blocks.0.cross_attn.k.lora_up.weight
lora key not loaded: blocks.0.cross_attn.o.alpha
lora key not loaded: blocks.0.cross_attn.o.lora_down.weight
lora key not loaded: blocks.0.cross_attn.o.lora_up.weight
lora key not loaded: blocks.0.cross_attn.q.alpha
lora key not loaded: blocks.0.cross_attn.q.lora_down.weight
lora key not loaded: blocks.0.cross_attn.q.lora_up.weight
We just re-uploaded the weights, and they now work for both workflows.
lora key not loaded: blocks.0.cross_attn.k.alpha
lora key not loaded: blocks.0.cross_attn.k.lora_down.weight
lora key not loaded: blocks.0.cross_attn.k.lora_up.weight
lora key not loaded: blocks.0.cross_attn.o.alpha
lora key not loaded: blocks.0.cross_attn.o.lora_down.weight
lora key not loaded: blocks.0.cross_attn.o.lora_up.weight
lora key not loaded: blocks.0.cross_attn.q.alpha
lora key not loaded: blocks.0.cross_attn.q.lora_down.weight
lora key not loaded: blocks.0.cross_attn.q.lora_up.weight
We just re-uploaded the weights, and they now work for both workflows.
IT'S WORKING!! THANK YOU. :) Hoping for I2V :)
thank you for your transformative work
Thank you very much for your appreciation.
There is still a performance gap between the distilled model and the base model, which is why we identify this model by date rather than by a version number.
Our quantitative evaluation on an in-house test set shows that the motion dynamics scores for the base model (wan2.2-T2V-A14B), Lora-250928, and Lora-v1.1 are 10.66, 7.76, and 5.27, respectively.
We are currently working on an improved version.