modularai/replit-code-1.5

#973
by Enderchef - opened

Quants, please

It's queued! :D

You can check for progress at http://hf.tst.eu/status.html or regularly check the model
summary page at https://hf.tst.eu/model#replit-code-v1_5-3b-GGUF for quants to appear.

The quant queue says " -2000 7 si replit-code-v1_5-3b error/1 bpe-pt missing (5e15815f…)". Is the error normal?

No, unfortunately it means the model failed. In this case it failed because the pre-tokenizer this model uses is not supported by llama.cpp, which can't be fixed unless you know of a similar, supported pre-tokenizer that would not cause any issues if used instead:

**************************************************************************************
** WARNING: The BPE pre-tokenizer was not recognized!
**          There are 2 possible reasons for this:
**          - the model has not been added to convert_hf_to_gguf_update.py yet
**          - the pre-tokenization config has changed upstream
**          Check your model files and convert_hf_to_gguf_update.py and update them accordingly.
** ref:     https://github.com/ggml-org/llama.cpp/pull/6920
**
** chkhsh:  5e15815fac4936080716d93a468f1da35ed671aa8478f226ab08ab6ad7af6a55
**************************************************************************************
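For background, here is a minimal sketch of how llama.cpp decides whether it recognizes a BPE pre-tokenizer (modelled loosely on convert_hf_to_gguf.py from the PR linked above; the test string, model path and hash-table entry below are placeholders, not the real values): it encodes a fixed test string with the model's tokenizer, hashes the resulting token IDs, and looks that hash up in a table of known pre-tokenizers. The chkhsh 5e15815f… above is simply not in that table.

```python
# Sketch of llama.cpp's pre-tokenizer detection (convert_hf_to_gguf.py, PR #6920).
# chktxt, the model path and the hash-table entry are illustrative placeholders.
from hashlib import sha256
from transformers import AutoTokenizer

chktxt = "..."  # llama.cpp uses a long, fixed test string full of tricky cases

tokenizer = AutoTokenizer.from_pretrained("replit-code-v1_5-3b")  # local model dir
chktok = tokenizer.encode(chktxt)                  # token IDs for the test string
chkhsh = sha256(str(chktok).encode()).hexdigest()  # fingerprint of the pre-tokenizer

# Known fingerprints map to pre-tokenizer names; an unknown hash triggers
# the "BPE pre-tokenizer was not recognized" warning shown above.
known_pre_tokenizers = {
    "<hash-of-a-supported-pre-tokenizer>": "llama-bpe",  # placeholder entry
}
print(known_pre_tokenizers.get(chkhsh, "not recognized -> conversion fails"))
```

If the pre-tokenization really is equivalent to one llama.cpp already supports, the usual route is to add the model's repo to the models list in convert_hf_to_gguf_update.py and rerun that script so the hash gets registered; if the scheme is genuinely new, llama.cpp first needs support for it.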
