natolambert committed (verified)
Commit b960243 · 1 Parent(s): b11c777

Update README.md

Files changed (1)
  1. README.md +4 -2
README.md CHANGED
@@ -57,13 +57,15 @@ olmo_model = AutoModelForCausalLM.from_pretrained("allenai/OLMo-2-0325-32B-Instr
 
 ### Chat template
 
+*NOTE: This is different than previous OLMo 2 and Tülu 3 models due to a minor change in configuration. It does NOT have the bos token before the rest. Our other models have <|endoftext|> at the beginning of the chat template.*
+
 The chat template for our models is formatted as:
 ```
-<|endoftext|><|user|>\nHow are you doing?\n<|assistant|>\nI'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
+<|user|>\nHow are you doing?\n<|assistant|>\nI'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
 ```
 Or with new lines expanded:
 ```
-<|endoftext|><|user|>
+<|user|>
 How are you doing?
 <|assistant|>
 I'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
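
A minimal sketch (not part of this commit) of how one might confirm the template change from the tokenizer side. It assumes the repo id `allenai/OLMo-2-0325-32B-Instruct` and that the tokenizer on the Hub carries the updated chat template; with this change, the rendered prompt should begin with `<|user|>` rather than `<|endoftext|><|user|>`.

```python
# Hypothetical check (not from the commit): render the chat template with
# transformers and confirm the bos token no longer leads the prompt.
from transformers import AutoTokenizer

# Assumed repo id; the tokenizer is expected to ship the updated template.
tokenizer = AutoTokenizer.from_pretrained("allenai/OLMo-2-0325-32B-Instruct")

messages = [{"role": "user", "content": "How are you doing?"}]

# add_generation_prompt=True appends the <|assistant|> turn marker so the
# model continues with its reply.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

print(repr(prompt))
# Per the updated README, the prompt should start with "<|user|>\n" and
# should NOT start with "<|endoftext|>".
assert not prompt.startswith("<|endoftext|>")
```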