# classifier-de1_roberta

This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3372
- Accuracy: 0.8833
- Precision: 0.5359
- Recall: 0.4365
- F1: 0.4811
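
For quick experimentation, the checkpoint can be loaded as a standard 🤗 Transformers text-classification pipeline. A minimal sketch, assuming the repo id `egerber1/classifier-de1_roberta`; the label names and their meanings are not documented in this card, and the `de1` suffix suggests, but does not confirm, German-language input:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint. Label names come from the model config's
# id2label mapping, which this card does not document.
classifier = pipeline("text-classification", model="egerber1/classifier-de1_roberta")

print(classifier("Ein Beispieltext zur Klassifikation."))
# e.g. [{'label': 'LABEL_1', 'score': 0.93}]
```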
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1.5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
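
These settings map one-to-one onto 🤗 Transformers `TrainingArguments`. A minimal sketch under that assumption; `output_dir` is illustrative, and the 500-step evaluation cadence is inferred from the results table below, not stated in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="classifier-de1_roberta",  # illustrative, not from this card
    learning_rate=1.5e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch",          # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=6,
    eval_strategy="steps",        # evaluation every 500 steps, per the table below
    eval_steps=500,
    logging_steps=500,
)
```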

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:--:|
0.3438 | 0.0513 | 500 | 0.3804 | 0.8760 | 0.0 | 0.0 | 0.0 |
0.3152 | 0.1025 | 1000 | 0.3564 | 0.8760 | 0.0 | 0.0 | 0.0 |
0.2733 | 0.1538 | 1500 | 0.3493 | 0.8641 | 0.4201 | 0.2516 | 0.3147 |
0.2512 | 0.2050 | 2000 | 0.3356 | 0.8740 | 0.4853 | 0.2719 | 0.3485 |
0.2508 | 0.2563 | 2500 | 0.3366 | 0.8850 | 0.6206 | 0.1865 | 0.2868 |
0.2355 | 0.3075 | 3000 | 0.3326 | 0.8865 | 0.6478 | 0.1864 | 0.2895 |
0.2345 | 0.3588 | 3500 | 0.3157 | 0.8849 | 0.5887 | 0.2378 | 0.3388 |
0.2178 | 0.4100 | 4000 | 0.3392 | 0.8872 | 0.6468 | 0.1983 | 0.3035 |
0.2127 | 0.4613 | 4500 | 0.3141 | 0.8874 | 0.6065 | 0.2622 | 0.3661 |
0.2047 | 0.5125 | 5000 | 0.3483 | 0.8889 | 0.6783 | 0.1985 | 0.3071 |
0.2052 | 0.5638 | 5500 | 0.3345 | 0.8872 | 0.6006 | 0.2696 | 0.3722 |
0.2107 | 0.6150 | 6000 | 0.3218 | 0.8891 | 0.6591 | 0.2185 | 0.3282 |
0.1984 | 0.6663 | 6500 | 0.3035 | 0.8823 | 0.5412 | 0.3366 | 0.4151 |
0.1827 | 0.7175 | 7000 | 0.3234 | 0.8882 | 0.6177 | 0.2586 | 0.3646 |
0.1793 | 0.7688 | 7500 | 0.3269 | 0.8878 | 0.5983 | 0.2898 | 0.3904 |
0.1831 | 0.8200 | 8000 | 0.3357 | 0.8898 | 0.6267 | 0.2743 | 0.3816 |
0.1789 | 0.8713 | 8500 | 0.3173 | 0.8879 | 0.5894 | 0.3160 | 0.4114 |
0.1669 | 0.9225 | 9000 | 0.3233 | 0.8840 | 0.5511 | 0.3461 | 0.4252 |
0.1711 | 0.9738 | 9500 | 0.3129 | 0.8891 | 0.6062 | 0.3012 | 0.4024 |
0.1745 | 1.0250 | 10000 | 0.3120 | 0.8893 | 0.6091 | 0.2996 | 0.4016 |
0.1656 | 1.0763 | 10500 | 0.3257 | 0.8890 | 0.5918 | 0.3381 | 0.4303 |
0.1668 | 1.1275 | 11000 | 0.3153 | 0.8864 | 0.5722 | 0.3321 | 0.4203 |
0.1505 | 1.1788 | 11500 | 0.3281 | 0.8840 | 0.5454 | 0.3882 | 0.4535 |
0.156 | 1.2300 | 12000 | 0.3344 | 0.8898 | 0.6064 | 0.3169 | 0.4162 |
0.155 | 1.2813 | 12500 | 0.3131 | 0.8896 | 0.6016 | 0.3249 | 0.4219 |
0.1729 | 1.3325 | 13000 | 0.3272 | 0.8911 | 0.6177 | 0.3204 | 0.4219 |
0.1558 | 1.3838 | 13500 | 0.3506 | 0.8911 | 0.6308 | 0.2942 | 0.4012 |
0.1587 | 1.4350 | 14000 | 0.3281 | 0.8916 | 0.6270 | 0.3113 | 0.4160 |
0.1609 | 1.4863 | 14500 | 0.3158 | 0.8900 | 0.6071 | 0.3195 | 0.4187 |
0.1638 | 1.5375 | 15000 | 0.3101 | 0.8937 | 0.6589 | 0.2958 | 0.4083 |
0.1372 | 1.5888 | 15500 | 0.3230 | 0.8928 | 0.6656 | 0.2729 | 0.3871 |
0.1594 | 1.6400 | 16000 | 0.3206 | 0.8909 | 0.6096 | 0.3347 | 0.4321 |
0.1589 | 1.6913 | 16500 | 0.3155 | 0.8924 | 0.6320 | 0.3169 | 0.4221 |
0.1515 | 1.7425 | 17000 | 0.3077 | 0.8908 | 0.6107 | 0.3301 | 0.4285 |
0.157 | 1.7938 | 17500 | 0.3080 | 0.8875 | 0.5690 | 0.3822 | 0.4573 |
0.1531 | 1.8450 | 18000 | 0.3258 | 0.8929 | 0.6694 | 0.2695 | 0.3843 |
0.1552 | 1.8963 | 18500 | 0.3241 | 0.8876 | 0.5742 | 0.3626 | 0.4445 |
0.1546 | 1.9475 | 19000 | 0.3167 | 0.8876 | 0.5755 | 0.3559 | 0.4398 |
0.14 | 1.9988 | 19500 | 0.3388 | 0.8876 | 0.5764 | 0.3537 | 0.4384 |
0.1579 | 2.0500 | 20000 | 0.3307 | 0.8911 | 0.6242 | 0.3068 | 0.4114 |
0.1402 | 2.1013 | 20500 | 0.3213 | 0.8906 | 0.6165 | 0.3105 | 0.4130 |
0.1349 | 2.1525 | 21000 | 0.3446 | 0.8902 | 0.6054 | 0.3281 | 0.4256 |
0.1506 | 2.2038 | 21500 | 0.3214 | 0.8909 | 0.6154 | 0.3208 | 0.4218 |
0.1219 | 2.2550 | 22000 | 0.3467 | 0.8893 | 0.5943 | 0.3376 | 0.4306 |
0.1469 | 2.3063 | 22500 | 0.3270 | 0.8922 | 0.6260 | 0.3244 | 0.4273 |
0.1487 | 2.3575 | 23000 | 0.3187 | 0.8890 | 0.5809 | 0.3765 | 0.4569 |
0.1341 | 2.4088 | 23500 | 0.3306 | 0.8887 | 0.5815 | 0.3646 | 0.4482 |
0.135 | 2.4600 | 24000 | 0.3415 | 0.8892 | 0.5908 | 0.3464 | 0.4367 |
0.1293 | 2.5113 | 24500 | 0.3546 | 0.8935 | 0.6468 | 0.3100 | 0.4192 |
0.1278 | 2.5625 | 25000 | 0.3618 | 0.8914 | 0.6185 | 0.3231 | 0.4245 |
0.1116 | 2.6138 | 25500 | 0.3505 | 0.8906 | 0.6033 | 0.3446 | 0.4386 |
0.1296 | 2.6650 | 26000 | 0.3494 | 0.8909 | 0.6195 | 0.3118 | 0.4148 |
0.1487 | 2.7163 | 26500 | 0.3328 | 0.8908 | 0.5979 | 0.3637 | 0.4523 |
0.1246 | 2.7675 | 27000 | 0.3481 | 0.8889 | 0.5771 | 0.3902 | 0.4656 |
0.1269 | 2.8188 | 27500 | 0.3844 | 0.8899 | 0.5929 | 0.3569 | 0.4455 |
0.1447 | 2.8700 | 28000 | 0.3254 | 0.8931 | 0.6391 | 0.3161 | 0.4230 |
0.1354 | 2.9213 | 28500 | 0.3251 | 0.8892 | 0.5814 | 0.3806 | 0.4600 |
0.137 | 2.9725 | 29000 | 0.3319 | 0.8922 | 0.6215 | 0.3351 | 0.4354 |
0.1063 | 3.0238 | 29500 | 0.3667 | 0.8924 | 0.6221 | 0.3370 | 0.4372 |
0.1095 | 3.0750 | 30000 | 0.3773 | 0.8919 | 0.6185 | 0.3347 | 0.4344 |
0.1254 | 3.1263 | 30500 | 0.3353 | 0.8929 | 0.6310 | 0.3275 | 0.4312 |
0.1204 | 3.1775 | 31000 | 0.3372 | 0.8833 | 0.5359 | 0.4365 | 0.4811 |
0.1013 | 3.2288 | 31500 | 0.3612 | 0.8893 | 0.5896 | 0.3536 | 0.4421 |
0.124 | 3.2800 | 32000 | 0.3470 | 0.8877 | 0.5683 | 0.3937 | 0.4652 |
0.1123 | 3.3313 | 32500 | 0.3479 | 0.8879 | 0.5718 | 0.3836 | 0.4592 |
0.1402 | 3.3825 | 33000 | 0.3561 | 0.8909 | 0.6038 | 0.3503 | 0.4433 |
0.1323 | 3.4338 | 33500 | 0.3417 | 0.8908 | 0.6044 | 0.3453 | 0.4395 |
0.1124 | 3.4850 | 34000 | 0.3409 | 0.8910 | 0.6018 | 0.3573 | 0.4484 |
0.1108 | 3.5363 | 34500 | 0.3723 | 0.8930 | 0.6400 | 0.3131 | 0.4205 |
0.121 | 3.5875 | 35000 | 0.3461 | 0.8920 | 0.6104 | 0.3567 | 0.4503 |
0.1145 | 3.6388 | 35500 | 0.3612 | 0.8901 | 0.5941 | 0.3601 | 0.4484 |
0.1052 | 3.6900 | 36000 | 0.3812 | 0.8916 | 0.6086 | 0.3518 | 0.4459 |
0.1181 | 3.7413 | 36500 | 0.3448 | 0.8877 | 0.5683 | 0.3916 | 0.4637 |
0.114 | 3.7925 | 37000 | 0.3502 | 0.8917 | 0.6081 | 0.3569 | 0.4498 |
0.1106 | 3.8438 | 37500 | 0.3499 | 0.8939 | 0.6471 | 0.3174 | 0.4259 |
0.127 | 3.8950 | 38000 | 0.3441 | 0.8926 | 0.6245 | 0.3360 | 0.4369 |
0.116 | 3.9463 | 38500 | 0.3358 | 0.8923 | 0.6163 | 0.3488 | 0.4455 |
0.1188 | 3.9975 | 39000 | 0.3496 | 0.8929 | 0.6295 | 0.3311 | 0.4340 |
0.1042 | 4.0488 | 39500 | 0.3884 | 0.8933 | 0.6292 | 0.3403 | 0.4417 |
0.1248 | 4.1000 | 40000 | 0.3510 | 0.8907 | 0.5950 | 0.3723 | 0.4580 |
0.0918 | 4.1513 | 40500 | 0.3670 | 0.8922 | 0.6149 | 0.3498 | 0.4460 |
0.1193 | 4.2025 | 41000 | 0.3725 | 0.8923 | 0.6130 | 0.3559 | 0.4504 |
0.1104 | 4.2538 | 41500 | 0.3949 | 0.8930 | 0.6335 | 0.3244 | 0.4290 |
0.1096 | 4.3050 | 42000 | 0.3742 | 0.8929 | 0.6268 | 0.3360 | 0.4375 |
0.122 | 4.3563 | 42500 | 0.3601 | 0.8935 | 0.6406 | 0.3209 | 0.4276 |
0.1162 | 4.4075 | 43000 | 0.3617 | 0.8917 | 0.6161 | 0.3359 | 0.4347 |
0.1031 | 4.4588 | 43500 | 0.3752 | 0.8918 | 0.6150 | 0.3416 | 0.4392 |
0.123 | 4.5100 | 44000 | 0.3720 | 0.8930 | 0.6245 | 0.3442 | 0.4438 |
0.0985 | 4.5613 | 44500 | 0.3783 | 0.8932 | 0.6296 | 0.3359 | 0.4381 |
0.1204 | 4.6125 | 45000 | 0.3617 | 0.8911 | 0.5987 | 0.3682 | 0.4560 |
0.1164 | 4.6638 | 45500 | 0.3566 | 0.8899 | 0.5928 | 0.3581 | 0.4465 |
0.1112 | 4.7150 | 46000 | 0.3622 | 0.8917 | 0.6162 | 0.3367 | 0.4354 |
0.1077 | 4.7663 | 46500 | 0.3831 | 0.8935 | 0.6612 | 0.2889 | 0.4021 |
0.1117 | 4.8175 | 47000 | 0.3668 | 0.8921 | 0.6144 | 0.3481 | 0.4444 |
0.1079 | 4.8688 | 47500 | 0.3830 | 0.8922 | 0.6173 | 0.3443 | 0.4421 |
0.1068 | 4.9200 | 48000 | 0.3710 | 0.8918 | 0.6116 | 0.3488 | 0.4443 |
0.1056 | 4.9713 | 48500 | 0.3824 | 0.8932 | 0.6439 | 0.3111 | 0.4195 |
0.1141 | 5.0226 | 49000 | 0.3688 | 0.8925 | 0.6204 | 0.3427 | 0.4415 |
0.123 | 5.0738 | 49500 | 0.3849 | 0.8912 | 0.5993 | 0.3707 | 0.4581 |
0.0977 | 5.1251 | 50000 | 0.4041 | 0.8925 | 0.6206 | 0.3414 | 0.4405 |
0.1091 | 5.1763 | 50500 | 0.3838 | 0.8901 | 0.5889 | 0.3755 | 0.4586 |
0.1082 | 5.2276 | 51000 | 0.3753 | 0.8914 | 0.6002 | 0.3717 | 0.4591 |
0.1119 | 5.2788 | 51500 | 0.3746 | 0.8919 | 0.6188 | 0.3348 | 0.4345 |
0.1206 | 5.3301 | 52000 | 0.3770 | 0.8913 | 0.6048 | 0.3562 | 0.4484 |
0.1136 | 5.3813 | 52500 | 0.3863 | 0.8905 | 0.5937 | 0.3709 | 0.4566 |
0.108 | 5.4326 | 53000 | 0.3671 | 0.8901 | 0.5889 | 0.3765 | 0.4594 |
0.1143 | 5.4838 | 53500 | 0.3798 | 0.8923 | 0.6178 | 0.3448 | 0.4426 |
0.1125 | 5.5351 | 54000 | 0.3828 | 0.8922 | 0.6169 | 0.3454 | 0.4428 |
0.1119 | 5.5863 | 54500 | 0.3743 | 0.8919 | 0.6129 | 0.3473 | 0.4434 |
0.1132 | 5.6376 | 55000 | 0.3621 | 0.8909 | 0.5989 | 0.3646 | 0.4533 |
0.1054 | 5.6888 | 55500 | 0.3651 | 0.8916 | 0.6091 | 0.3519 | 0.4461 |
0.0993 | 5.7401 | 56000 | 0.3721 | 0.8911 | 0.5999 | 0.3651 | 0.4539 |
0.0904 | 5.7913 | 56500 | 0.3825 | 0.8915 | 0.6065 | 0.3549 | 0.4478 |
0.1278 | 5.8426 | 57000 | 0.3801 | 0.8921 | 0.6148 | 0.3472 | 0.4438 |
0.0983 | 5.8938 | 57500 | 0.3813 | 0.8919 | 0.6122 | 0.3501 | 0.4455 |
0.0945 | 5.9451 | 58000 | 0.3819 | 0.8919 | 0.6119 | 0.3504 | 0.4456 |
0.0956 | 5.9963 | 58500 | 0.3767 | 0.8918 | 0.6097 | 0.3550 | 0.4487 |
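
Precision, recall, and F1 start at 0.0 while accuracy stays near 0.876, which is consistent with binary classification on an imbalanced set where the positive class is the minority (predicting all negatives would already yield roughly 87.6% accuracy). The metric code itself is not documented; a plausible `compute_metrics` sketch under that binary-labels assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Produce the four metric columns reported above (binary averaging assumed)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```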

### Framework versions

- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1