T5 Hre Vietnamese translation 1.5
This model is a fine-tuned version of google-t5/t5-small for Hre-Vietnamese translation. The training dataset is not documented in this card. It achieves the following results on the evaluation set:
- Loss: 0.0448
Model description
More information needed
Intended uses & limitations
More information needed
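Although detailed usage notes are not yet documented, the sketch below shows one way to run inference with the fine-tuned checkpoint. The repo id `your-username/t5-hre-vi-1.5` is a placeholder (replace it with the actual model id), and whether a task prefix was used during fine-tuning is not documented, so none is added here.

```python
# Minimal inference sketch, assuming the checkpoint is available on the Hub
# under a hypothetical repo id and that inputs are raw Hre sentences.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-username/t5-hre-vi-1.5"  # placeholder id, not confirmed by this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # a Hre source sentence
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```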
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a minimal reproduction sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
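The sketch below shows how these hyperparameters map onto `Seq2SeqTrainingArguments` in Transformers 4.47. It is an assumption-laden reconstruction, not the exact training script: the dataset loading, preprocessing, and output directory are not documented in this card, so they are either omitted or marked as placeholders.

```python
# Hedged reconstruction of the listed hyperparameters with the HF Trainer API.
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-small")

args = Seq2SeqTrainingArguments(
    output_dir="t5-hre-vi",          # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",             # AdamW (torch), default betas/epsilon as listed above
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="epoch",           # per-epoch evaluation, consistent with the results table
)

# Dataset preparation is not documented in this card, so the trainer call
# is left as a commented outline:
# trainer = Seq2SeqTrainer(
#     model=model,
#     args=args,
#     train_dataset=train_dataset,   # placeholder
#     eval_dataset=eval_dataset,     # placeholder
#     data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
# )
# trainer.train()
```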
Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
No log | 1.0 | 336 | 0.1288 |
0.1618 | 2.0 | 672 | 0.1270 |
0.1588 | 3.0 | 1008 | 0.1247 |
0.1588 | 4.0 | 1344 | 0.1218 |
0.1546 | 5.0 | 1680 | 0.1198 |
0.153 | 6.0 | 2016 | 0.1177 |
0.153 | 7.0 | 2352 | 0.1160 |
0.1494 | 8.0 | 2688 | 0.1136 |
0.1483 | 9.0 | 3024 | 0.1123 |
0.1483 | 10.0 | 3360 | 0.1098 |
0.1445 | 11.0 | 3696 | 0.1082 |
0.1416 | 12.0 | 4032 | 0.1062 |
0.1416 | 13.0 | 4368 | 0.1041 |
0.1411 | 14.0 | 4704 | 0.1025 |
0.1369 | 15.0 | 5040 | 0.1008 |
0.1369 | 16.0 | 5376 | 0.0994 |
0.1366 | 17.0 | 5712 | 0.0988 |
0.1335 | 18.0 | 6048 | 0.0968 |
0.1335 | 19.0 | 6384 | 0.0947 |
0.1341 | 20.0 | 6720 | 0.0932 |
0.129 | 21.0 | 7056 | 0.0921 |
0.129 | 22.0 | 7392 | 0.0905 |
0.1269 | 23.0 | 7728 | 0.0886 |
0.1288 | 24.0 | 8064 | 0.0869 |
0.1288 | 25.0 | 8400 | 0.0862 |
0.1251 | 26.0 | 8736 | 0.0847 |
0.1238 | 27.0 | 9072 | 0.0843 |
0.1238 | 28.0 | 9408 | 0.0826 |
0.1225 | 29.0 | 9744 | 0.0806 |
0.1205 | 30.0 | 10080 | 0.0793 |
0.1205 | 31.0 | 10416 | 0.0793 |
0.1196 | 32.0 | 10752 | 0.0775 |
0.1201 | 33.0 | 11088 | 0.0763 |
0.1201 | 34.0 | 11424 | 0.0753 |
0.1166 | 35.0 | 11760 | 0.0742 |
0.1152 | 36.0 | 12096 | 0.0733 |
0.1152 | 37.0 | 12432 | 0.0721 |
0.1168 | 38.0 | 12768 | 0.0713 |
0.1151 | 39.0 | 13104 | 0.0704 |
0.1151 | 40.0 | 13440 | 0.0696 |
0.1117 | 41.0 | 13776 | 0.0678 |
0.113 | 42.0 | 14112 | 0.0676 |
0.113 | 43.0 | 14448 | 0.0662 |
0.1103 | 44.0 | 14784 | 0.0658 |
0.1105 | 45.0 | 15120 | 0.0655 |
0.1105 | 46.0 | 15456 | 0.0641 |
0.1104 | 47.0 | 15792 | 0.0635 |
0.1082 | 48.0 | 16128 | 0.0628 |
0.1082 | 49.0 | 16464 | 0.0622 |
0.1075 | 50.0 | 16800 | 0.0618 |
0.1059 | 51.0 | 17136 | 0.0611 |
0.1059 | 52.0 | 17472 | 0.0600 |
0.1064 | 53.0 | 17808 | 0.0597 |
0.1051 | 54.0 | 18144 | 0.0589 |
0.1051 | 55.0 | 18480 | 0.0580 |
0.104 | 56.0 | 18816 | 0.0579 |
0.1036 | 57.0 | 19152 | 0.0572 |
0.1036 | 58.0 | 19488 | 0.0562 |
0.1016 | 59.0 | 19824 | 0.0562 |
0.1008 | 60.0 | 20160 | 0.0553 |
0.1008 | 61.0 | 20496 | 0.0551 |
0.1017 | 62.0 | 20832 | 0.0543 |
0.1 | 63.0 | 21168 | 0.0537 |
0.0983 | 64.0 | 21504 | 0.0532 |
0.0983 | 65.0 | 21840 | 0.0526 |
0.0973 | 66.0 | 22176 | 0.0523 |
0.0988 | 67.0 | 22512 | 0.0518 |
0.0988 | 68.0 | 22848 | 0.0514 |
0.0973 | 69.0 | 23184 | 0.0511 |
0.0961 | 70.0 | 23520 | 0.0510 |
0.0961 | 71.0 | 23856 | 0.0500 |
0.0957 | 72.0 | 24192 | 0.0496 |
0.0955 | 73.0 | 24528 | 0.0499 |
0.0955 | 74.0 | 24864 | 0.0491 |
0.0948 | 75.0 | 25200 | 0.0488 |
0.0942 | 76.0 | 25536 | 0.0486 |
0.0942 | 77.0 | 25872 | 0.0486 |
0.0921 | 78.0 | 26208 | 0.0479 |
0.0935 | 79.0 | 26544 | 0.0475 |
0.0935 | 80.0 | 26880 | 0.0478 |
0.0927 | 81.0 | 27216 | 0.0473 |
0.0933 | 82.0 | 27552 | 0.0471 |
0.0933 | 83.0 | 27888 | 0.0467 |
0.0916 | 84.0 | 28224 | 0.0464 |
0.0922 | 85.0 | 28560 | 0.0463 |
0.0922 | 86.0 | 28896 | 0.0462 |
0.0922 | 87.0 | 29232 | 0.0458 |
0.0914 | 88.0 | 29568 | 0.0457 |
0.0914 | 89.0 | 29904 | 0.0456 |
0.0897 | 90.0 | 30240 | 0.0454 |
0.088 | 91.0 | 30576 | 0.0452 |
0.088 | 92.0 | 30912 | 0.0452 |
0.0923 | 93.0 | 31248 | 0.0451 |
0.0897 | 94.0 | 31584 | 0.0450 |
0.0897 | 95.0 | 31920 | 0.0449 |
0.0901 | 96.0 | 32256 | 0.0449 |
0.0901 | 97.0 | 32592 | 0.0449 |
0.0901 | 98.0 | 32928 | 0.0448 |
0.0901 | 99.0 | 33264 | 0.0448 |
0.0893 | 100.0 | 33600 | 0.0448 |
Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0