Blank String in System Prompt causes Error - chat_prompt needs some fixes!

#41
by calycekr - opened

It looks like the chat template (chat_prompt) needs some fixes.

Gemma 3 on vLLM v0.8.2


Request

{
  "model": "gemma-3-27b-it",
  "messages": [
    {
      "role": "system",
      "content": ""
    },
    {
      "role": "user",
      "content": "Who are you?"
    }
  ],
  "stream": false,
  "max_tokens": 100
}

Response

{
  "object": "error",
  "message": "list object has no element 0",
  "type": "BadRequestError",
  "param": null,
  "code": 400
}

Removing the system message altogether makes it work.

Request

{
  "model": "gemma-3-27b-it",
  "messages": [
    {
      "role": "user",
      "content": "Who are you?"
    }
  ],
  "stream": false,
  "max_tokens": 100
}

Response

{
  "id": "chatcmpl-560ed12f1de94b45ad4933419586e161",
  "object": "chat.completion",
  "created": 1743042717,
  "model": "test",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "reasoning_content": null,
        "content": "Hi there! I’m Gemma, a large language model created by the Gemma team at Google DeepMind. I’m an open-weights model, which means I’m publicly available for use! \n\nI’m designed to take text and images as input and produce text as output. \n\nHow can I help you today?",
        "tool_calls": []
      },
      "logprobs": null,
      "finish_reason": "stop",
      "stop_reason": 106
    }
  ],
  "usage": {
    "prompt_tokens": 13,
    "total_tokens": 83,
    "completion_tokens": 70,
    "prompt_tokens_details": null
  },
  "prompt_logprobs": null
}

The chat template below fixes it for now. I think there's a better way, though. Please reflect this fix in the repo.

{
    "chat_template": "{{- bos_token -}}\n{%- if messages[0]['role'] == 'system' %}\n    {%- if messages[0]['content'] %}\n        {%- if messages[0]['content'] is string %}\n            {%- set first_user_prefix = messages[0]['content'] + '\n\n' %}\n        {%- else %}\n            {%- set first_user_prefix = messages[0]['content'][0]['text'] + '\n\n' %}\n        {%- endif %}\n        {%- set loop_messages = messages[1:] %}\n    {%- else %}\n        {%- set first_user_prefix = '' %}\n        {%- set loop_messages = messages[1:] %}\n    {%- endif %}\n{%- else %}\n    {%- set first_user_prefix = '' %}\n    {%- set loop_messages = messages %}\n{%- endif %}\n{%- for message in loop_messages %}\n    {%- if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}\n        {{- raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') -}}\n    {%- endif %}\n    {%- if (message['role'] == 'assistant') %}\n        {%- set role = 'model' %}\n    {%- else %}\n        {%- set role = message['role'] %}\n    {%- endif %}\n    {{- '<start_of_turn>' + role + '\n' + (first_user_prefix if loop.first else '') -}}\n    {%- if message['content'] is string %}\n        {{- message['content'] | trim -}}\n    {%- elif message['content'] is iterable %}\n        {%- for item in message['content'] %}\n            {%- if item['type'] == 'image' %}\n                {{- '<start_of_image>' -}}\n            {%- elif item['type'] == 'text' %}\n                {{- item['text'] | trim -}}\n            {%- endif %}\n        {%- endfor %}\n    {%- else %}\n        {{- raise_exception('Invalid content type') -}}\n    {%- endif %}\n    {{- '<end_of_turn>\n' -}}\n{%- endfor %}\n{%- if add_generation_prompt %}\n    {{- '<start_of_turn>model\n' -}}\n{%- endif %}"
}
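The key change in the template above is the added truthiness check on messages[0]['content'] before indexing into it. Here is a minimal sketch (not part of vLLM itself) that renders a simplified excerpt of that logic with jinja2, using StrictUndefined to mimic strict rendering; the excerpt is illustrative, not vLLM's actual internals.

```python
# Simplified excerpt of the system-prefix logic, with the truthiness
# guard on messages[0]['content'] added by the fixed template.
from jinja2 import Environment, StrictUndefined

template_src = (
    "{%- if messages[0]['role'] == 'system' and messages[0]['content'] %}"
    "{%- if messages[0]['content'] is string %}"
    "{{- messages[0]['content'] }}"
    "{%- else %}"
    "{{- messages[0]['content'][0]['text'] }}"
    "{%- endif %}"
    "{%- endif %}"
)

env = Environment(undefined=StrictUndefined)
tmpl = env.from_string(template_src)

# A blank system message no longer reaches the [0]['text'] lookup,
# so rendering succeeds instead of failing with
# "list object has no element 0".
out = tmpl.render(messages=[
    {"role": "system", "content": ""},
    {"role": "user", "content": "Who are you?"},
])
print(repr(out))  # -> ''
```

With a non-empty system string, the same template emits the content as the first-turn prefix, so the fix does not change behavior for normal requests.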

Hi @calycekr ,

Apologies for the late reply, and welcome to the Google Gemma family of open-source models. Could you please confirm whether the issue is resolved by the comment above? If you require any further assistance, please let me know; I'm more than happy to help you out.

Thanks.
