does it work with TensorRT for dev and schnell?

#24
by froilo - opened

does it work with TensorRT?
Any minimal example available?

It seems some ONNX operators do not support the torch.bfloat16 data type.

I found ComfyUI TensorRT node (https://github.com/comfyanonymous/ComfyUI_TensorRT/tree/master) support the bf16 Flux, but ComfyUI only supports t5xxl_fp16 (https://comfyanonymous.github.io/ComfyUI_examples/flux/).

Maybe the TensorRT node only converts the transformer, while the text encoders and the VAE need to be converted by other means.


Did you manage to convert only flux1_dev to ONNX, without the text encoders?
