Always check tokenizer.json for the actual number of tokens (the base vocabulary plus any added tokens), then set the model's vocab size to match that count; a mismatch between the declared vocab size and the tokenizer's token count is a common cause of a failed conversion or a broken GGUF.
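A minimal sketch of counting the tokens, assuming a standard Hugging Face `tokenizer.json` layout where the base vocabulary sits under `model.vocab` and extra tokens are listed in `added_tokens` (the filename and helper name here are illustrative):

```python
import json

def count_tokens(path="tokenizer.json"):
    # Load a Hugging Face-style tokenizer.json and count vocabulary entries.
    with open(path, "r", encoding="utf-8") as f:
        tok = json.load(f)
    base = len(tok["model"]["vocab"])  # base vocabulary size
    # Only count added tokens whose ids extend past the base vocab;
    # some added tokens merely re-declare ids already inside it.
    extra = [t for t in tok.get("added_tokens", []) if t["id"] >= base]
    return base + len(extra)
```

The returned count is the number you would compare against (and, if needed, set as) the model's vocab size before converting.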