
Text Generation WebUI says it supports GGML models?

#3
by illtellyoulater - opened

@TheBloke why do you say GGML models don't work in text-generation-webui?

from https://github.com/oobabooga/text-generation-webui#ggml-models

"You can drop these directly into the models/ folder, making sure that the file name contains ggml somewhere and ends in .bin."
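As a rough sketch (this is not webui's actual implementation, just the naming rule quoted above expressed as code), the check amounts to something like:

```python
def looks_like_ggml(filename: str) -> bool:
    """Hypothetical sketch of the README's naming rule: the file name
    must contain 'ggml' somewhere and end in '.bin'."""
    name = filename.lower()
    return "ggml" in name and name.endswith(".bin")

print(looks_like_ggml("falcon-7b.ggmlv3.q4_0.bin"))  # True
print(looks_like_ggml("falcon-7b.safetensors"))      # False
```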

I don't. I say THESE GGML models don't work in text-generation-webui.

text-generation-webui's GGML support is provided by llama-cpp-python, which currently supports only the same GGML models as llama.cpp itself: Llama and OpenLlama models. Not Falcon, MPT, GPT-J, GPT-NeoX, or StarCoder.
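To illustrate the compatibility gap described above (a sketch only, not real loader code; the lowercase architecture names here are my own labels, not identifiers from llama.cpp):

```python
# Sketch: which GGML model architectures llama-cpp-python can load,
# per the explanation above. Not actual llama.cpp or webui code.
LLAMA_CPP_SUPPORTED = {"llama", "openllama"}
NOT_YET_SUPPORTED = {"falcon", "mpt", "gpt-j", "gpt-neox", "starcoder"}

def loadable_in_webui(architecture: str) -> bool:
    """Return True if a GGML model of this architecture would load via
    llama-cpp-python, and therefore in text-generation-webui."""
    return architecture.lower() in LLAMA_CPP_SUPPORTED

print(loadable_in_webui("llama"))   # True
print(loadable_in_webui("falcon"))  # False
```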

These Falcon models use a particular GGML variant (now actually called GGCC) which is currently not supported by any UI, but hopefully that will change soon.

