Unable to load with the transformers library because config files are missing.

#4 opened by mlwithaarvy

The respective config files are missing.

This is a GGML version of the model, so I'm afraid you can't use it with transformers.
Try
https://huggingface.co/eachadea/vicuna-7b-1.1

Is there any guide to turning one of these GGML models into an API?
I am using Flask and the Transformers library, but everything only works in the local terminal as described, not in Flask via the Transformers API. I have searched every possible source.

Sure, it's actually pretty simple to get started (you don't need transformers for that):

  1. Download the model of your choice in GGML format and place it in a local folder.
  2. Install llama-cpp-python, which provides Python bindings for llama.cpp.
  3. Follow the llama-cpp-python instructions to call the model from Python, and make sure that this works (see the sketch after this list).
  4. Then make the same call from inside the Flask application instead.
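
Here is a minimal sketch of steps 3 and 4. The model file name, route name, and generation parameters are only examples, not taken from this repo; point them at whatever GGML file you actually downloaded.

Step 3, calling the model directly with llama-cpp-python:

```python
# Minimal llama-cpp-python call (step 3).
# The model path is an example; use your own GGML file.
from llama_cpp import Llama

llm = Llama(model_path="./models/vicuna-7b-1.1.ggmlv3.q4_0.bin")

output = llm("Q: What is the capital of France? A:", max_tokens=32, stop=["Q:"])
print(output["choices"][0]["text"])
```

Step 4, wrapping the same call in a Flask route:

```python
# Minimal Flask wrapper around the same llama-cpp-python call (step 4).
# The route name and port are arbitrary examples.
from flask import Flask, request, jsonify
from llama_cpp import Llama

app = Flask(__name__)

# Load the model once at startup; loading it per request would be far too slow.
llm = Llama(model_path="./models/vicuna-7b-1.1.ggmlv3.q4_0.bin")

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.get_json().get("prompt", "")
    output = llm(prompt, max_tokens=128)
    return jsonify({"text": output["choices"][0]["text"]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

You can then test the endpoint with something like `curl -X POST http://localhost:5000/generate -H "Content-Type: application/json" -d '{"prompt": "Hello"}'`.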
