How is the model being loaded? I am unable to run this inference in offline mode
#6 · opened by sanjoy2
Is this model not allowed to be used offline? It seems the model is loaded using a custom loader, and I am unable to run it. Please help.
The local model path is being treated as a repo id. I have tried updating `local_path`, enabling `trust_remote_code`, and every other option I could think of.
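For reference, here is a minimal sketch of the kind of offline load I am attempting, assuming the standard `transformers` auto-class API; the directory path and the `AutoModelForCausalLM` class are placeholders for my actual setup:

```python
import os

# Force offline mode so transformers never tries to reach the Hub
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical local directory containing config.json, the weights,
# and the custom loader's *.py files downloaded from the repo
local_path = "/path/to/local/model"

tokenizer = AutoTokenizer.from_pretrained(local_path, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(
    local_path,
    trust_remote_code=True,   # the model ships custom loading code
    local_files_only=True,    # treat local_path as a directory, not a repo id
)
```

Even with this, the path still seems to be interpreted as a repo id.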
I am having the same problem.