Text Generation · PyTorch · English · olmo2 · conversational
natolambert committed · Commit 2c8ee0b (verified) · 1 Parent(s): b38592b

Update README.md

Files changed (1): README.md +4 -2
README.md CHANGED
@@ -56,13 +56,15 @@ olmo_model = AutoModelForCausalLM.from_pretrained("allenai/OLMo-2-0325-32B-DPO")
 
 ### Chat template
 
+*NOTE: This is different than previous OLMo 2 and Tülu 3 models due to a minor change in configuration. It does NOT have the bos token before the rest. Our other models have <|endoftext|> at the beginning of the chat template.*
+
 The chat template for our models is formatted as:
 ```
-<|endoftext|><|user|>\nHow are you doing?\n<|assistant|>\nI'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
+<|user|>\nHow are you doing?\n<|assistant|>\nI'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
 ```
 Or with new lines expanded:
 ```
-<|endoftext|><|user|>
+<|user|>
 How are you doing?
 <|assistant|>
 I'm just a computer program, so I don't have feelings, but I'm functioning as expected. How can I assist you today?<|endoftext|>
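
A quick way to see the updated behavior is to render the chat template with `transformers` and check that the prompt string now begins with `<|user|>` rather than `<|endoftext|>`. The following is a minimal, untested sketch; it assumes the tokenizer shipped with `allenai/OLMo-2-0325-32B-DPO` implements the template shown in the diff above.

```python
# Sketch: render the chat template and confirm no leading <|endoftext|> (bos) token.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allenai/OLMo-2-0325-32B-DPO")

messages = [{"role": "user", "content": "How are you doing?"}]

# tokenize=False returns the raw prompt string so the template can be inspected;
# add_generation_prompt=True appends the <|assistant|> turn header.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)

# Per the updated template above, the prompt should start with <|user|>,
# not with the <|endoftext|> bos token used by earlier OLMo 2 / Tülu 3 models.
assert not prompt.startswith("<|endoftext|>")
```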