---
license: cc-by-sa-4.0
language:
- pl
metrics:
- bleu
base_model:
- GreTa
library_name: transformers
datasets:
- mrapacz/greek-interlinear-translations
---
# Model Card for Ancient Greek to Polish Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to Polish, maintaining word-level alignment between source and target texts.
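The sketch below shows one way such a checkpoint might be loaded and queried with the standard `transformers` seq2seq classes. The repository identifier, the input formatting, and the generation settings are illustrative assumptions only and may differ from how this particular model is published and preprocessed.

```python
# Minimal usage sketch. Assumptions: the checkpoint loads with the standard
# T5/seq2seq Auto classes and is published on the Hugging Face Hub under the
# placeholder identifier below; the real id and input format may differ.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "mrapacz/interlinear-pl-greta"  # hypothetical identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Normalized (diacritics-free) Ancient Greek input, matching the card's
# "Normalized" preprocessing setting.
text = "λεγει αυτω ο ιησους εγειρε αρον τον κραβαττον σου και περιπατει"

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```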
## Model Details

### Model Description

- **Developed By:** Maciej Rapacz, AGH University of Kraków
- **Model Type:** Neural machine translation (T5-based)
- **Base Model:** GreTa
- **Tokenizer:** GreTa
- **Language(s):** Ancient Greek (source) → Polish (target)
- **License:** CC BY-NC-SA 4.0
- **Tag Set:** BH (Bible Hub)
- **Text Preprocessing:** Normalized
- **Morphological Encoding:** emb-auto

### Model Performance

- **BLEU Score:** 46.01
- **SemScore:** 0.91
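For reference, the sketch below illustrates how a corpus-level BLEU score of this kind can be computed with `sacrebleu`; the hypotheses and references are placeholders, not the evaluation data or setup from the paper.

```python
# Illustrative BLEU computation with sacrebleu (placeholder data only).
import sacrebleu

hypotheses = ["placeholder model output"]          # model translations (placeholder)
references = ["placeholder reference translation"]  # gold interlinear glosses (placeholder)

# corpus_bleu expects a list of hypotheses and a list of reference sets.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU: {bleu.score:.2f}")
```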
### Model Sources

- **Repository:** https://github.com/mrapacz/loreslm-interlinear-translation
- **Paper:** https://aclanthology.org/2025.loreslm-1.11/