24GB VRAM out of memory
#1 · by OrangeApples · opened
I'm trying to load this model on my 3090, but I keep getting a CUDA out-of-memory error in text-gen-ui even with the 8-bit cache enabled. I'm wondering whether there's something wrong with my setup specifically, or if this model really does require more memory than 24GB.
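For what it's worth, here's the rough math I've been using to sanity-check whether quantized weights plus the KV cache should fit in 24GB. The parameter count, bits per weight, layer/head counts, and context length below are placeholder assumptions for illustration, not this model's actual specs:

```python
# Back-of-envelope VRAM estimate: quantized weights + KV cache.
# All numbers in the example call are hypothetical, not from this repo.

def estimate_vram_gb(n_params_b: float, bits_per_weight: float,
                     n_layers: int, n_kv_heads: int, head_dim: int,
                     context_len: int, cache_bits: int = 8) -> float:
    """Rough total of quantized weights plus KV cache, in GB."""
    # Weights: parameters (in billions) * bits per weight, converted to bytes.
    weights_gb = n_params_b * 1e9 * bits_per_weight / 8 / 1e9
    # KV cache: 2 tensors (K and V) per layer, per token, per KV head.
    kv_cache_gb = (2 * n_layers * n_kv_heads * head_dim * context_len
                   * cache_bits / 8) / 1e9
    return weights_gb + kv_cache_gb

# Hypothetical 34B-class model at ~4.65 bpw, GQA with 8 KV heads,
# 32k context, 8-bit cache -- already close to 24 GB before overhead.
print(f"{estimate_vram_gb(34, 4.65, 48, 8, 128, 32768):.1f} GB")
```

If someone can confirm the real figures for this model, I can plug them in and see whether it's my setup or just not enough headroom once the desktop and other processes take their share.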