does it work with TensorRT for dev and schnell?
does it work with TensorRT?
Any minimal example available?
It seems some ONNX operators do not support the torch.bfloat16 data type.
Could this somehow help? https://github.com/microsoft/onnxscript/pull/1492
See also: https://github.com/microsoft/onnxruntime/issues/13001
and: https://github.com/microsoft/onnxscript/issues/1462
I found ComfyUI TensorRT node (https://github.com/comfyanonymous/ComfyUI_TensorRT/tree/master) support the bf16 Flux, but ComfyUI only supports t5xxl_fp16 (https://comfyanonymous.github.io/ComfyUI_examples/flux/).
Maybe the TensorRT node only converts the transformer, while the text encoders and VAE need to be converted by other means.
Did you manage to convert only flux1_dev to ONNX, without the text encoders?