laineyyy committed
Commit 6294b7a · verified · Parent: a615f9d

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -20,6 +20,8 @@ Poro 2 was created in a collaboration between [AMD Silo AI](https://www.amd.com/
 
 This model demonstrates how continued pretraining can efficiently add new language capabilities to existing models while maintaining performance in the original domains. Through the combination of English and Finnish training data, we achieve a model that substantially outperforms the base Llama 3.1 8B model in Finnish while maintaining solid English proficiency.
 
+For more details on our training and data curation process, check out our [Continued Pretraining Playbook](https://rocm.blogs.amd.com/artificial-intelligence/multilingual-continued-pretraining/README.html).
+
 ## Poro 2 Model Family
 
 The Poro 2 model family includes both 8B and 70B models, and there are three different versions released of the Poro 2 models: a base model, a post-training SFT-only checkpoint, and the final instruct model which is the SFT model plus a round of DPO.
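The README text above describes three release variants (base, SFT-only, and the SFT+DPO instruct model). As a minimal sketch of how one such checkpoint might be loaded with the Hugging Face `transformers` API, assuming a hypothetical repository ID that this commit does not specify:

```python
# Minimal sketch: loading a Poro 2 instruct checkpoint with transformers.
# NOTE: "LumiOpen/Poro-2-8B-Instruct" is an assumed placeholder repo ID,
# not confirmed by this commit; substitute the ID from the actual model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LumiOpen/Poro-2-8B-Instruct"  # assumption, see note above
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the `accelerate` package to be installed.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Instruct-tuned variants expect the chat template; base checkpoints do not.
messages = [{"role": "user", "content": "Kerro lyhyesti Suomen historiasta."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```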