# Indonesian T5 Language Models
A collection of Indonesian T5 models pre-trained with nanoT5 and fine-tuned on IndoNLG tasks. GitHub: https://github.com/LazarusNLP/IndoT5/
This model is a fine-tuned version of LazarusNLP/IndoNanoT5-base on the indonlg dataset; its per-epoch results on the evaluation set are reported in the training results table below.
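A minimal usage sketch, assuming the standard `transformers` seq2seq API. This card does not name the fine-tuned repository, so the base model id is used below as a stand-in; substitute the actual checkpoint id for this model:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder: the base model id, since this card does not name the
# fine-tuned repository. Substitute the actual checkpoint id.
model_name = "LazarusNLP/IndoNanoT5-base"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Encode an Indonesian input sentence and generate with beam search.
inputs = tokenizer("Saya ingin memesan tiket kereta ke Bandung.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```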
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
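The hyperparameter list itself is not preserved in this card. As a rough, hypothetical sketch of what an IndoNLG fine-tuning setup with the Hugging Face `Seq2SeqTrainer` might look like (every value below is an assumption, not the configuration actually used; the toy dataset stands in for a real IndoNLG task):

```python
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "LazarusNLP/IndoNanoT5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy stand-in for an IndoNLG task; the real dataset, field names, and
# preprocessing are task-specific and not shown in this card.
toy = Dataset.from_dict({
    "input": ["terjemahkan Inggris ke Indonesia: I am reading a book."],
    "target": ["Saya sedang membaca buku."],
})

def preprocess(batch):
    enc = tokenizer(batch["input"], truncation=True, max_length=512)
    enc["labels"] = tokenizer(
        text_target=batch["target"], truncation=True, max_length=128
    )["input_ids"]
    return enc

tokenized = toy.map(preprocess, batched=True, remove_columns=toy.column_names)

# Placeholder values; the actual hyperparameters are not listed in this card.
args = Seq2SeqTrainingArguments(
    output_dir="indot5-indonlg",
    learning_rate=1e-3,             # assumption, typical for T5-style models
    per_device_train_batch_size=8,  # assumption
    num_train_epochs=11,            # the results table spans 11 epochs
    eval_strategy="epoch",          # `evaluation_strategy` on transformers < 4.41
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```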
### Training results

| Training Loss | Epoch | Step   | Validation Loss | Bleu   | Sacrebleu |
|:-------------:|:-----:|:------:|:---------------:|:------:|:---------:|
| 1.9872        | 1.0   | 15516  | 1.8482          | 3.7015 | 3.7015    |
| 1.888         | 2.0   | 31032  | 1.8434          | 4.0409 | 4.0409    |
| 1.8207        | 3.0   | 46548  | 1.8347          | 4.1239 | 4.1239    |
| 1.7716        | 4.0   | 62064  | 1.8340          | 4.3231 | 4.3231    |
| 1.6948        | 5.0   | 77580  | 1.8443          | 4.4283 | 4.4283    |
| 1.6442        | 6.0   | 93096  | 1.8563          | 4.5338 | 4.5338    |
| 1.5856        | 7.0   | 108612 | 1.8782          | 4.3033 | 4.3033    |
| 1.5451        | 8.0   | 124128 | 1.8930          | 4.3286 | 4.3286    |
| 1.5056        | 9.0   | 139644 | 1.9207          | 4.2773 | 4.2773    |
| 1.446         | 10.0  | 155160 | 1.9406          | 4.0629 | 4.0629    |
| 1.406         | 11.0  | 170676 | 1.9636          | 4.1382 | 4.1382    |
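The Bleu and Sacrebleu columns match in every row, which suggests both report the same SacreBLEU score. Note that validation loss bottoms out at epoch 4 (1.8340) while BLEU peaks at epoch 6 (4.5338), so the reported checkpoint depends on which metric drove model selection. A minimal sketch of computing this metric with the Hugging Face `evaluate` library (hypothetical sentences; SacreBLEU expects a list of reference lists, one list per prediction):

```python
import evaluate

sacrebleu = evaluate.load("sacrebleu")

# Hypothetical model outputs and gold references.
predictions = ["Saya sedang membaca buku di perpustakaan."]
references = [["Saya membaca buku di perpustakaan."]]

result = sacrebleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))  # corpus-level score, comparable to the Bleu column
```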