laineyyy committed · verified · Commit 7495b5f · Parent(s): 226d520

Update README.md

Files changed (1): README.md (+2 −0)
README.md CHANGED
@@ -17,6 +17,8 @@ Poro 2 was created in a collaboration between [AMD Silo AI](https://www.amd.com/
 
 This model demonstrates how continued pretraining followed by instruction tuning can efficiently add new language capabilities to existing models while maintaining strong conversational abilities in both the original and target languages.
 
+For more details on our training and data generation pipeline, check out our [Continued Pretraining Playbook](https://rocm.blogs.amd.com/artificial-intelligence/multilingual-continued-pretraining/README.html).
+
 ## Poro 2 Model Family
 
 The Poro 2 model family includes both 8B and 70B models, each released in three versions: a base model, a post-training SFT-only checkpoint, and the final instruct model, which is the SFT model plus a round of DPO.