
The template file has a typo in {{ .Response }}: only "onse }}" is present, causing the model not to respond

#1
by mashriram - opened

[Screenshot: template file showing the truncated {{ .Response }} placeholder]

FROM /usr/share/ollama/.ollama/models/blobs/sha256-4dbf02d64b32375935c4dfab74209fb045f9e2eb268d77bb158737c841c58dfe
TEMPLATE
"{{ if .System }}<s>[SYSTEM_PROMPT]Think deeply before answering the user's question. Do the thinking inside ... tags.
{{ .System }}[/SYSTEM_PROMPT]{{ end }}{{ if .Prompt }}[INST]{{ .Prompt }}[/INST]
{{ end }}onse }}</s>"
PARAMETER stop <s>
PARAMETER stop </s>
PARAMETER stop [INST]

As you can see, only "onse }}" remains where {{ .Response }} should be.

Corrected template:

{{ if .System }}<s>[SYSTEM_PROMPT]Think deeply before answering the user's question. Do the thinking inside ... tags.

{{ .System }}[/SYSTEM_PROMPT]{{ end }}{{ if .Prompt }}[INST]{{ .Prompt }}[/INST]
{{ .Response }}<_/s_>{{ end }}
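The effect of the truncation can be illustrated with a minimal placeholder-substitution sketch. This is not Ollama's actual Go-template engine; the render helper and the shortened templates below are illustrative assumptions. With the broken template, the literal text "onse }}" leaks into the prompt and there is no slot for the model's response.

```python
import re

# Hypothetical mini-renderer mimicking the subset of Go template syntax
# used in the Modelfile: {{ if .X }}...{{ end }} blocks and {{ .X }}
# substitutions. Not Ollama's real engine -- just enough to show the bug.
def render(template, values):
    def if_block(m):
        # Keep the block body only when the variable is set and truthy.
        return m.group(2) if values.get(m.group(1)) else ""
    # Expand non-nested {{ if .X }}...{{ end }} blocks first.
    out = re.sub(r"\{\{ if \.(\w+) \}\}(.*?)\{\{ end \}\}", if_block,
                 template, flags=re.S)
    # Then substitute the remaining {{ .X }} placeholders.
    return re.sub(r"\{\{ \.(\w+) \}\}",
                  lambda m: values.get(m.group(1), ""), out)

# Shortened versions of the broken and corrected templates.
BROKEN = ("{{ if .Prompt }}[INST]{{ .Prompt }}[/INST]\n"
          "{{ end }}onse }}</s>")
FIXED = ("{{ if .Prompt }}[INST]{{ .Prompt }}[/INST]\n"
         "{{ .Response }}</s>{{ end }}")

vals = {"Prompt": "hi", "Response": "ANSWER"}
print(render(BROKEN, vals))  # literal "onse }}" leaks; ANSWER never appears
print(render(FIXED, vals))   # ANSWER is placed before </s>
```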

Hi @mashriram

We hadn't tested it with Ollama. We tried the GGUF version directly with llama.cpp, where the template loaded correctly.
We'll test with Ollama and get back to you; in the meantime, please try it with llama.cpp.
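Until the upload is fixed, one possible local workaround (a sketch, not an official fix: the model name my-model-fixed is a placeholder, and the stop tokens are assumed from the dump above) is to write a Modelfile with the repaired template and rebuild with `ollama create`:

```
# Modelfile (config fragment).
# Build and run with (model name is a placeholder):
#   ollama create my-model-fixed -f Modelfile
#   ollama run my-model-fixed
FROM /usr/share/ollama/.ollama/models/blobs/sha256-4dbf02d64b32375935c4dfab74209fb045f9e2eb268d77bb158737c841c58dfe
TEMPLATE """{{ if .System }}<s>[SYSTEM_PROMPT]Think deeply before answering the user's question. Do the thinking inside ... tags.
{{ .System }}[/SYSTEM_PROMPT]{{ end }}{{ if .Prompt }}[INST]{{ .Prompt }}[/INST]
{{ .Response }}</s>{{ end }}"""
PARAMETER stop <s>
PARAMETER stop </s>
PARAMETER stop [INST]
```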
