# mdeberta-v3-base_regression_5_seed69_NL-IT
This model is a fine-tuned version of [microsoft/mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8107
- MSE: 3.0228
- RMSE: 1.7386
- MAE: 1.1019
- R2: 0.0618
- F1: 0.7313
- Precision: 0.7306
- Recall: 0.7375
- Accuracy: 0.7375
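The mix of regression metrics (MSE, RMSE, MAE, R2) and classification metrics (F1, precision, recall, accuracy) suggests the model's continuous outputs are mapped to discrete labels before the classification scores are taken. The actual evaluation code is not documented; the sketch below is an assumption-laden illustration in plain Python, rounding predictions to the nearest integer label for accuracy.

```python
# Hedged sketch: how mixed regression/classification metrics like those
# above could be computed. Rounding to the nearest integer label is an
# assumption; the model's real evaluation script is unknown.
import math

def regression_metrics(preds, labels):
    n = len(preds)
    errors = [p - t for p, t in zip(preds, labels)]
    mse = sum(e * e for e in errors) / n
    mae = sum(abs(e) for e in errors) / n
    mean_label = sum(labels) / n
    ss_tot = sum((t - mean_label) ** 2 for t in labels)
    # R2 = 1 - SS_res / SS_tot, with SS_res = n * MSE
    r2 = 1.0 - (mse * n) / ss_tot
    return {"mse": mse, "rmse": math.sqrt(mse), "mae": mae, "r2": r2}

def accuracy_after_rounding(preds, labels):
    # Map continuous regression outputs to discrete class labels.
    rounded = [round(p) for p in preds]
    return sum(r == t for r, t in zip(rounded, labels)) / len(labels)

# Toy data, purely for illustration.
preds = [0.9, 2.2, 1.4, 3.1]
labels = [1, 2, 2, 3]
print(regression_metrics(preds, labels))
print(accuracy_after_rounding(preds, labels))
```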
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 10
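The hyperparameters above map directly onto a `transformers.TrainingArguments` configuration. A minimal sketch, assuming the standard `Trainer` API (the author's actual training script, dataset, and metric functions are not documented; `output_dir` and the 100-step evaluation cadence are inferred from the results table below):

```python
# Hedged sketch, not the author's actual script: TrainingArguments
# mirroring the listed hyperparameters. Adam betas (0.9, 0.999) and
# epsilon 1e-8 are the Transformers defaults, so they are not set here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mdeberta-v3-base_regression_5_seed69_NL-IT",  # assumed
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=10,
    evaluation_strategy="steps",  # the results table logs every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```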
### Training results
| Training Loss | Epoch  | Step | Validation Loss | MSE    | RMSE   | MAE    | R2      | F1     | Precision | Recall | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:-------:|:------:|:---------:|:------:|:--------:|
| 1.2686        | 0.2105 | 100  | 1.0348          | 3.5486 | 1.8838 | 1.4950 | -0.1470 | 0.5333 | 0.4444    | 0.6667 | 0.6667   |
| 1.0527        | 0.4211 | 200  | 1.0271          | 3.3385 | 1.8272 | 1.4928 | -0.0791 | 0.5333 | 0.4444    | 0.6667 | 0.6667   |
| 1.0554        | 0.6316 | 300  | 0.9925          | 3.0827 | 1.7557 | 1.4434 | 0.0036  | 0.5333 | 0.4444    | 0.6667 | 0.6667   |
| 0.9855        | 0.8421 | 400  | 0.9574          | 3.2051 | 1.7903 | 1.3379 | -0.0360 | 0.5333 | 0.4444    | 0.6667 | 0.6667   |
| 0.9683        | 1.0526 | 500  | 0.9145          | 2.9293 | 1.7115 | 1.3043 | 0.0532  | 0.5333 | 0.4444    | 0.6667 | 0.6667   |
| 0.9035        | 1.2632 | 600  | 0.9039          | 2.9731 | 1.7243 | 1.2658 | 0.0390  | 0.5333 | 0.4444    | 0.6667 | 0.6667   |
| 0.9152        | 1.4737 | 700  | 0.8716          | 2.8287 | 1.6819 | 1.2489 | 0.0857  | 0.5961 | 0.6692    | 0.6821 | 0.6821   |
| 0.8686        | 1.6842 | 800  | 0.8942          | 2.9308 | 1.7120 | 1.2728 | 0.0527  | 0.6840 | 0.6807    | 0.6904 | 0.6904   |
| 0.8318        | 1.8947 | 900  | 0.8755          | 2.8950 | 1.7015 | 1.2539 | 0.0643  | 0.6879 | 0.6851    | 0.6987 | 0.6987   |
| 0.7712        | 2.1053 | 1000 | 0.8290          | 2.8370 | 1.6843 | 1.1719 | 0.0830  | 0.6900 | 0.6982    | 0.7141 | 0.7141   |
| 0.7223        | 2.3158 | 1100 | 0.8509          | 2.9955 | 1.7308 | 1.1748 | 0.0318  | 0.6965 | 0.6953    | 0.7094 | 0.7094   |
| 0.6730        | 2.5263 | 1200 | 0.8089          | 2.8114 | 1.6767 | 1.1317 | 0.0913  | 0.7194 | 0.7186    | 0.7295 | 0.7295   |
| 0.7588        | 2.7368 | 1300 | 0.7937          | 2.7320 | 1.6529 | 1.1221 | 0.1170  | 0.7088 | 0.7141    | 0.7272 | 0.7272   |
| 0.7272        | 2.9474 | 1400 | 0.7876          | 2.8003 | 1.6734 | 1.0889 | 0.0949  | 0.7127 | 0.7236    | 0.7343 | 0.7343   |
| 0.7067        | 3.1579 | 1500 | 0.7953          | 2.8252 | 1.6808 | 1.1103 | 0.0868  | 0.7235 | 0.7255    | 0.7367 | 0.7367   |
| 0.6188        | 3.3684 | 1600 | 0.8261          | 3.0567 | 1.7483 | 1.1120 | 0.0120  | 0.6868 | 0.7144    | 0.7224 | 0.7224   |
| 0.6739        | 3.5789 | 1700 | 0.7795          | 2.8149 | 1.6778 | 1.0859 | 0.0902  | 0.7074 | 0.7195    | 0.7307 | 0.7307   |
| 0.6268        | 3.7895 | 1800 | 0.7979          | 2.8785 | 1.6966 | 1.0905 | 0.0696  | 0.7009 | 0.7159    | 0.7272 | 0.7272   |
| 0.6153        | 4.0    | 1900 | 0.8107          | 2.9016 | 1.7034 | 1.1205 | 0.0621  | 0.7188 | 0.7176    | 0.7284 | 0.7284   |
| 0.5826        | 4.2105 | 2000 | 0.7920          | 2.8243 | 1.6806 | 1.0940 | 0.0871  | 0.7423 | 0.7404    | 0.7461 | 0.7461   |
| 0.5995        | 4.4211 | 2100 | 0.7702          | 2.6708 | 1.6343 | 1.0843 | 0.1367  | 0.7403 | 0.7385    | 0.7450 | 0.7450   |
| 0.5991        | 4.6316 | 2200 | 0.7650          | 2.6919 | 1.6407 | 1.0568 | 0.1299  | 0.7403 | 0.7430    | 0.7521 | 0.7521   |
| 0.5429        | 4.8421 | 2300 | 0.7610          | 2.7230 | 1.6502 | 1.0471 | 0.1198  | 0.7250 | 0.7346    | 0.7438 | 0.7438   |
| 0.5632        | 5.0526 | 2400 | 0.7815          | 2.7910 | 1.6706 | 1.0859 | 0.0979  | 0.7281 | 0.7308    | 0.7414 | 0.7414   |
| 0.5153        | 5.2632 | 2500 | 0.7490          | 2.7125 | 1.6470 | 1.0339 | 0.1232  | 0.7525 | 0.7536    | 0.7616 | 0.7616   |
| 0.5034        | 5.4737 | 2600 | 0.7857          | 2.9123 | 1.7065 | 1.0660 | 0.0587  | 0.7479 | 0.7466    | 0.7497 | 0.7497   |
| 0.5119        | 5.6842 | 2700 | 0.7614          | 2.7580 | 1.6607 | 1.0481 | 0.1085  | 0.7463 | 0.7449    | 0.7485 | 0.7485   |
| 0.5110        | 5.8947 | 2800 | 0.7288          | 2.6286 | 1.6213 | 1.0034 | 0.1504  | 0.7498 | 0.7554    | 0.7628 | 0.7628   |
| 0.5119        | 6.1053 | 2900 | 0.7609          | 2.7530 | 1.6592 | 1.0489 | 0.1101  | 0.7394 | 0.7405    | 0.7497 | 0.7497   |
| 0.4733        | 6.3158 | 3000 | 0.7794          | 2.8620 | 1.6917 | 1.0572 | 0.0749  | 0.7252 | 0.7258    | 0.7367 | 0.7367   |
| 0.4909        | 6.5263 | 3100 | 0.7687          | 2.7897 | 1.6702 | 1.0660 | 0.0983  | 0.7570 | 0.7554    | 0.7604 | 0.7604   |
### Framework versions
- Transformers 4.40.2
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.19.1