fp8 inference (#26) opened 12 months ago by Melody32768, 1 comment

wrong model (#25) opened 12 months ago by sunhaha123

Update README.md (#24) opened 12 months ago by WBD8

Unet? (#22) opened about 1 year ago by aiRabbit0

How to load into VRAM? (#19) opened about 1 year ago by MicahV, 2 comments

'float8_e4m3fn' attribute error (#17) opened about 1 year ago by Magenta6, 6 comments

Loading flux-fp8 with diffusers (#16) opened about 1 year ago by 8au, 1 comment

Quantization Method? (#7) opened about 1 year ago by vyralsurfer, 10 comments, 👍 2

ComfyUi Workflow (#6) opened about 1 year ago by Jebari, 1 comment

Diffusers? (#4) opened about 1 year ago by tintwotin, 19 comments

Minimum vram requirements? (#3) opened about 1 year ago by joachimsallstrom, 3 comments

FP16 (#2) opened about 1 year ago by bsbsbsbs112321, 1 reaction, 1 comment

Metadata lost from model (#1) opened about 1 year ago by mcmonkey, 4 comments