# Prompt templates
> [!WARNING]
> We now recommend using the `tokenizer` field to fetch the chat template directly from the Hub. Set it to your model id on the Hub and the matching template will be used automatically.
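
For example, a `MODELS` entry in `.env.local` can point the `tokenizer` field at the model repository. This is only a minimal sketch: the model id is illustrative, and a real entry will usually also configure endpoints and generation parameters.

```env
MODELS=`[
  {
    "name": "mistralai/Mistral-7B-Instruct-v0.2",
    "tokenizer": "mistralai/Mistral-7B-Instruct-v0.2"
  }
]`
```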

These are the templates used to format the conversation history for different models used in HuggingChat. Set them in your `.env.local` [like so](https://github.com/huggingface/chat-ui#chatprompttemplate).
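
If you do set a template manually, it goes in the `chatPromptTemplate` field of the corresponding `MODELS` entry. Here is a minimal sketch using the Zephyr template from this page (the model name is illustrative, and endpoint and parameter fields are omitted):

```env
MODELS=`[
  {
    "name": "HuggingFaceH4/zephyr-7b-beta",
    "chatPromptTemplate": "<|system|>\n{{preprompt}}</s>\n{{#each messages}}{{#ifUser}}<|user|>\n{{content}}</s>\n<|assistant|>\n{{/ifUser}}{{#ifAssistant}}{{content}}</s>\n{{/ifAssistant}}{{/each}}"
  }
]`
```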
## Llama 2
```env
<s>[INST] <<SYS>>\n{{preprompt}}\n<</SYS>>\n\n{{#each messages}}{{#ifUser}}{{content}} [/INST] {{/ifUser}}{{#ifAssistant}}{{content}} </s><s>[INST] {{/ifAssistant}}{{/each}}
```
## CodeLlama
```env
<s>[INST] <<SYS>>\n{{preprompt}}\n<</SYS>>\n\n{{#each messages}}{{#ifUser}}{{content}} [/INST] {{/ifUser}}{{#ifAssistant}}{{content}} </s><s>[INST] {{/ifAssistant}}{{/each}}
```
## Falcon
```env
System: {{preprompt}}\nUser:{{#each messages}}{{#ifUser}}{{content}}\nFalcon:{{/ifUser}}{{#ifAssistant}}{{content}}\nUser:{{/ifAssistant}}{{/each}}
```
## Mistral
```env
<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}
```
## Zephyr
```env
<|system|>\n{{preprompt}}</s>\n{{#each messages}}{{#ifUser}}<|user|>\n{{content}}</s>\n<|assistant|>\n{{/ifUser}}{{#ifAssistant}}{{content}}</s>\n{{/ifAssistant}}{{/each}}
```
## IDEFICS
```env
{{#each messages}}{{#ifUser}}User: {{content}}{{/ifUser}}<end_of_utterance>\nAssistant: {{#ifAssistant}}{{content}}\n{{/ifAssistant}}{{/each}}
```
## OpenChat
```env
<s>{{#each messages}}{{#ifUser}}GPT4 User: {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}}{{content}}<|end_of_turn|>GPT4 Assistant: {{/ifUser}}{{#ifAssistant}}{{content}}<|end_of_turn|>{{/ifAssistant}}{{/each}}
```
## Mixtral
```env
<s>{{#each messages}}{{#ifUser}}[INST]{{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}} {{content}}</s> {{/ifAssistant}}{{/each}}
```
## ChatML
```env
{{#if @root.preprompt}}<|im_start|>system\n{{@root.preprompt}}<|im_end|>\n{{/if}}{{#each messages}}{{#ifUser}}<|im_start|>user\n{{content}}<|im_end|>\n<|im_start|>assistant\n{{/ifUser}}{{#ifAssistant}}{{content}}<|im_end|>\n{{/ifAssistant}}{{/each}}
```
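
As a concrete illustration of how the Handlebars helpers expand: with an example preprompt of `You are a helpful assistant.` and a single user message `Hello!`, the ChatML template above produces the following prompt (the `\n` sequences shown as real newlines):

```
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
Hello!<|im_end|>
<|im_start|>assistant
```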
## CodeLlama 70B
```env
<s>{{#if @root.preprompt}}Source: system\n\n {{@root.preprompt}} <step> {{/if}}{{#each messages}}{{#ifUser}}Source: user\n\n {{content}} <step> {{/ifUser}}{{#ifAssistant}}Source: assistant\n\n {{content}} <step> {{/ifAssistant}}{{/each}}Source: assistant\nDestination: user\n\n ``
```
## Gemma
```env
{{#each messages}}{{#ifUser}}<start_of_turn>user\n{{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}}{{content}}<end_of_turn>\n<start_of_turn>model\n{{/ifUser}}{{#ifAssistant}}{{content}}<end_of_turn>\n{{/ifAssistant}}{{/each}}
```