Quick question about prompting format #2
opened by actuallyasriel
When you say that this model uses ChatML's prompting format but not the special tokens, what do you mean by "special tokens" in this case?
Sorry if this is kind of a noob question; I'm just trying to figure out whether I need to alter something in SillyTavern's ChatML presets for this to work properly (I'm getting `### Response:` in outputs despite that string not appearing in the prompt at all).
Yeah, in SillyTavern I try to use the ChatML presets. You might have to check "Use as stop strings" to keep that from bleeding over.
yeah i can never get those to work even on models that explicitly suggest them; maybe it's the "use as stop strings" thing that i need
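For anyone landing here with the same question: ChatML wraps each turn in `<|im_start|>` / `<|im_end|>` markers, and those markers are the "special tokens" in question (some models follow the same layout but tokenize them as ordinary text instead of dedicated vocabulary entries). A minimal sketch of the layout in Python (the helper name is ours, not part of any library):

```python
def chatml_prompt(messages):
    """Build a ChatML-style prompt string from a list of role/content dicts.

    <|im_start|> and <|im_end|> are the ChatML special tokens; whether a
    given model treats them as single tokens or plain text depends on its
    tokenizer config.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Open an assistant turn so the model knows to respond next.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

In SillyTavern terms, `<|im_end|>` is the string you'd want recognized as a stop string so generation halts at the end of the assistant turn instead of bleeding into a new one.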