polish-roberta-large-v1

An encoder model based on the RoBERTa architecture (large variant, 355M parameters), pre-trained on a large corpus of Polish texts. More information can be found in our GitHub repository and in the publication Pre-training Polish Transformer-Based Language Models at Scale.
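
As a brief usage sketch, the model can be loaded through the Hugging Face Transformers library. The snippet below assumes the checkpoint is published on the Hub as sdadas/polish-roberta-large-v1 and follows the standard RoBERTa `<mask>` convention for masked-token prediction; adjust the mask token if this checkpoint deviates from that convention.

```python
# Minimal fill-mask example (sketch; model ID and <mask> token assumed as noted above).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="sdadas/polish-roberta-large-v1")

# "Warszawa jest stolicą <mask>." = "Warsaw is the capital of <mask>."
for prediction in fill_mask("Warszawa jest stolicą <mask>."):
    # Each prediction is a dict with the filled token and its probability.
    print(f'{prediction["token_str"]:>12}  {prediction["score"]:.3f}')
```

The same checkpoint can also be loaded with AutoModel for feature extraction or fine-tuned with AutoModelForSequenceClassification on downstream Polish NLP tasks.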

Citation

```bibtex
@inproceedings{dadas2020pre,
  title={Pre-training polish transformer-based language models at scale},
  author={Dadas, S{\l}awomir and Pere{\l}kiewicz, Micha{\l} and Po{\'s}wiata, Rafa{\l}},
  booktitle={International Conference on Artificial Intelligence and Soft Computing},
  pages={301--314},
  year={2020},
  organization={Springer}
}
```