
Min VRAM?

#3
by scraper01 - opened

Tried to load the model onto an RTX 4060 mobile with 8 GB of VRAM.

Not up to it: inference time is way over 25 minutes. Flash attention is disabled because I'm on Windows.
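For reference, a minimal sketch of the low-VRAM loading path I'm aware of in Diffusers (fp16 plus CPU offload; the repo ID below is a placeholder, not this model's actual ID):

```python
import torch
from diffusers import DiffusionPipeline

# Placeholder repo ID -- substitute the model this thread is about.
pipe = DiffusionPipeline.from_pretrained(
    "org/model-id",
    torch_dtype=torch.float16,  # halve weight memory vs. fp32
)

# Keep submodules on CPU and move each to the GPU only while it runs;
# trades speed for a much smaller VRAM footprint.
pipe.enable_model_cpu_offload()

# Optional: compute attention in slices to lower peak memory further.
pipe.enable_attention_slicing()

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("out.png")
```

Even with those options it's far too slow on 8 GB.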

If I want to run this on Windows, how much VRAM do I need to get reasonable inference times, circa 15-20 s?

Regards,

Andy.
