can't run inference on downloaded model
#1 opened by kheiri
Tried LM Studio and oobabooga; both give me an error message. Tried the Q6 and Q8 quants, same error with both.
I have no issues running it. Can you provide the error message?
I'm running in oobabooga btw
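If it helps as a sanity check, here's a minimal sketch that loads a GGUF quant directly with llama-cpp-python, outside of LM Studio or oobabooga. The filename below is a placeholder, not the actual file name in this repo; point it at whichever Q6/Q8 file you downloaded.

```python
# Minimal load-and-generate check for a downloaded GGUF quant.
# Assumes `pip install llama-cpp-python`; model_path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./model-Q6_K.gguf",  # placeholder: path to your downloaded Q6/Q8 GGUF
    n_ctx=2048,                      # small context window for a quick test
)

out = llm("Hello, ", max_tokens=16)  # short completion just to confirm inference works
print(out["choices"][0]["text"])
```

If that loads and generates, the file itself is fine and the issue is more likely in the frontend's loader settings.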
OK, it's running now. I must have been doing something wrong. :)
kheiri changed discussion status to closed