Fix to make Loras work with Torch Compile

#10
by cyan2k - opened

You just need to put the LoRA node BEFORE Torch Compile, not after. You can't change a compiled model after it has been compiled; doing so reverts the compilation, which is why it looks like the LoRA isn't working.
You want to compile over the sum of LoRA + model.
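A minimal sketch of that ordering, using a toy `torch.nn.Linear` as a stand-in for the real model (the rank, alpha, and weight values here are invented for illustration): merge the LoRA delta into the base weights first, then compile.

```python
import torch

# Toy base model standing in for the diffusion model.
base = torch.nn.Linear(8, 8, bias=False)

# Hypothetical LoRA factors: delta W = (alpha / r) * B @ A
r, alpha = 2, 1.0
A = torch.randn(r, 8) * 0.01
B = torch.randn(8, r) * 0.01

# 1) Merge the LoRA into the base weights BEFORE compiling.
with torch.no_grad():
    base.weight += (alpha / r) * (B @ A)

# 2) Compile the merged model. Mutating the weights after this point would
#    invalidate the compiled graph (the "reverting" behavior described above).
compiled = torch.compile(base, backend="eager")

out = compiled(torch.randn(1, 8))
print(out.shape)  # torch.Size([1, 8])
```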

I'll check that once I have time (; Thanks for the tip!

Thanks!

I noticed something else you might want to take a look at, haha. It is about the distilled 0.9.7 model workflow.

When a model is labeled as "cfg distilled," it means it was distilled using a fixed CFG value. The distillation process bakes that guidance into the model itself, so changing the CFG during inference doesn't just have no effect, it can actually degrade output quality. This fixed-CFG design is also part of why the model runs faster: it avoids the need to calculate both the conditional and unconditional passes.
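As a toy illustration of that two-pass vs. one-pass difference (the `toy` denoiser and its numbers are made up, not from any real model):

```python
def cfg_step(model, x, cond, uncond, cfg_scale):
    # Classic CFG: two forward passes per denoising step.
    eps_cond = model(x, cond)
    eps_uncond = model(x, uncond)
    return eps_uncond + cfg_scale * (eps_cond - eps_uncond)

def distilled_step(model, x, cond):
    # CFG-distilled: guidance is baked into the weights, one pass only.
    # There is no cfg_scale knob left to turn at inference time.
    return model(x, cond)

# Toy denoiser: any function of (latent, conditioning) works here.
toy = lambda x, c: 0.5 * x + c

# With cfg_scale = 1 the CFG formula collapses to the conditional pass,
# which is exactly what the distilled model reproduces in a single call.
a = cfg_step(toy, 1.0, 0.2, 0.0, 1.0)
b = distilled_step(toy, 1.0, 0.2)
assert abs(a - b) < 1e-12
```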

However, I noticed the current workflow in your repo doesn’t stick to that assumption. In the "STG Guider Advanced" settings, the marked line:

*(screenshot: image.png)*

is the CFG configuration across the denoising steps. That's correct and optimal for a cfg-distilled model. But if you're using settings like "13B dynamic", you're deviating from the cfg=1 assumption, which can hurt quality.

That's why, in their distilled workflow, ltxv removed the "STG Presets Advanced" component and set the "STG Guider Advanced" to:

"widgets_values": [
  0.9970000000000002,
  true,
  "1.0, 0.9933, 0.9850, 0.9767, 0.9008, 0.6180",
  "1,1,1,1,1,1",
  "0,0,0,0,0,0",
  "1, 1, 1, 1, 1, 1",
  "[25], [35], [35], [42], [42], [42]"
]

to make sure the distilled model gets used with the intended configuration. Perhaps it would be possible to set the guider to these values.
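If I'm reading the rows right (an assumption on my part; the screenshot marks which line is the CFG one), the fourth entry is the per-step CFG list, and it's pinned to 1 at every denoising step, matching the distilled assumption:

```python
# Hypothetical reading of the guider values above; which row is the CFG
# row is assumed, not confirmed.
widgets_values = [
    0.9970000000000002,
    True,
    "1.0, 0.9933, 0.9850, 0.9767, 0.9008, 0.6180",
    "1,1,1,1,1,1",
    "0,0,0,0,0,0",
    "1, 1, 1, 1, 1, 1",
    "[25], [35], [35], [42], [42], [42]",
]

cfg_per_step = [float(v) for v in widgets_values[3].split(",")]
assert all(c == 1.0 for c in cfg_per_step)
print(cfg_per_step)  # [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
```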

This wouldn't change the output of your workflow, since it would get overwritten by the Preset control anyway, but if people want to test the distilled model "as designed", they can easily do so by removing the link between the preset and the guider component.

Thanks, sorry for the wall of text :D

I'll change that later, thanks for letting me know (;
