adv_fin_run_bs16_lr5e-06_layers4_reg0

This model is a fine-tuned version of Zamza/XLM-roberta-large-ftit-emb-lr01 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 7.0198
  • Precision Type: 0.8724
  • Recall Type: 0.6914
  • F1 Type: 0.7575
  • Accuracy Type: 0.6914
  • Precision Class: 0.8516
  • Recall Class: 0.4625
  • F1 Class: 0.5730
  • Accuracy Class: 0.4625
  • Precision Rel: 0.9618
  • Recall Rel: 0.6313
  • F1 Rel: 0.7527
  • Accuracy Rel: 0.6313
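In every metric group the Recall and Accuracy values coincide, which is consistent with accuracy being computed over the set of gold items. A minimal sketch of such micro-averaged scoring, assuming a set-of-labeled-items framing that the card itself does not document:

```python
def prf_accuracy(gold, pred):
    """Micro-averaged precision/recall/F1 for labeled predictions.

    gold, pred: sets of (item, label) pairs -- a common framing for
    span/relation extraction scoring (an assumption; the card does not
    state its exact metric definition).
    """
    tp = len(gold & pred)  # exact matches on both item and label
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    # Accuracy measured over the gold items equals recall, matching the
    # Recall == Accuracy pattern seen in this card's metrics.
    accuracy = recall
    return precision, recall, f1, accuracy
```

Note that under this framing F1 is the harmonic mean of per-evaluation precision and recall, which is then averaged, so the reported aggregate F1 need not equal the harmonic mean of the aggregate precision and recall.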

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 25
  • mixed_precision_training: Native AMP
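These values map onto transformers.TrainingArguments roughly as follows. This is a reconstruction sketch, not the actual training script; output_dir and the fp16 flag are assumptions (the latter based on the "Native AMP" note), and the dataset, model head, and metric functions are not documented on this card.

```python
from transformers import TrainingArguments

# Hyperparameters from this card, expressed as TrainingArguments (sketch).
args = TrainingArguments(
    output_dir="adv_fin_run_bs16_lr5e-06_layers4_reg0",  # assumed
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    fp16=True,  # "Native AMP" mixed precision
)
```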

Training results

Training Loss | Epoch | Step | Validation Loss | Precision Type | Recall Type | F1 Type | Accuracy Type | Precision Class | Recall Class | F1 Class | Accuracy Class | Precision Rel | Recall Rel | F1 Rel | Accuracy Rel
8.8082 0.2181 1000 8.6997 0.8216 0.6053 0.6776 0.6053 0.7660 0.1714 0.2138 0.1714 0.9568 0.2859 0.4166 0.2859
7.9664 0.4362 2000 7.6315 0.8549 0.6343 0.7106 0.6343 0.8016 0.3632 0.4643 0.3632 0.9599 0.5289 0.6696 0.5289
7.1492 0.6543 3000 6.9926 0.8574 0.6551 0.7261 0.6551 0.8202 0.4153 0.5161 0.4153 0.9617 0.5216 0.6643 0.5216
6.9614 0.8724 4000 6.6153 0.8593 0.6896 0.7524 0.6896 0.8304 0.4097 0.5144 0.4097 0.9627 0.5887 0.7214 0.5887
6.2 1.0905 5000 6.3489 0.8635 0.6307 0.7092 0.6307 0.8334 0.3631 0.4671 0.3631 0.9636 0.4813 0.6294 0.4813
5.8458 1.3086 6000 6.2081 0.8692 0.6227 0.7058 0.6227 0.8329 0.3746 0.4800 0.3746 0.9625 0.5669 0.7022 0.5669
5.7155 1.5267 7000 6.0427 0.8698 0.6668 0.7392 0.6668 0.8368 0.4621 0.5688 0.4621 0.9619 0.5891 0.7188 0.5891
5.999 1.7448 8000 5.9186 0.8749 0.6703 0.7454 0.6703 0.8394 0.4301 0.5397 0.4301 0.9632 0.5849 0.7181 0.5849
6.1301 1.9629 9000 5.8386 0.8699 0.6739 0.7444 0.6739 0.8421 0.3848 0.4939 0.3848 0.9626 0.5569 0.6931 0.5569
5.4109 2.1810 10000 5.7561 0.8741 0.6813 0.7521 0.6813 0.8416 0.4319 0.5414 0.4319 0.9637 0.5428 0.6828 0.5428
5.4101 2.3991 11000 5.6962 0.8791 0.6397 0.7243 0.6397 0.8457 0.3969 0.5064 0.3969 0.9640 0.4977 0.6434 0.4977
5.2936 2.6172 12000 5.6628 0.8726 0.6994 0.7638 0.6994 0.8466 0.4269 0.5382 0.4269 0.9642 0.5911 0.7231 0.5911
5.2238 2.8353 13000 5.6280 0.8790 0.6619 0.7403 0.6619 0.8461 0.4399 0.5506 0.4399 0.9634 0.5808 0.7146 0.5808
5.2957 3.0534 14000 5.6183 0.8754 0.6715 0.7459 0.6715 0.8479 0.4006 0.5118 0.4006 0.9637 0.5739 0.7087 0.5739
5.2465 3.2715 15000 5.5930 0.8681 0.6973 0.7595 0.6973 0.8451 0.4361 0.5476 0.4361 0.9648 0.5700 0.7071 0.5700
5.3637 3.4896 16000 5.5609 0.8718 0.6839 0.7515 0.6839 0.8505 0.4241 0.5369 0.4241 0.9653 0.5558 0.6965 0.5558
4.8713 3.7077 17000 5.5274 0.8713 0.6878 0.7533 0.6878 0.8455 0.4354 0.5466 0.4354 0.9631 0.5680 0.7031 0.5680
5.4422 3.9258 18000 5.5055 0.8772 0.6724 0.7469 0.6724 0.8486 0.4114 0.5232 0.4114 0.9634 0.5340 0.6750 0.5340
4.8576 4.1439 19000 5.5128 0.8754 0.6835 0.7545 0.6835 0.8486 0.4269 0.5379 0.4269 0.9634 0.5773 0.7118 0.5773
4.9733 4.3621 20000 5.5083 0.8785 0.6921 0.7617 0.6921 0.8493 0.4414 0.5536 0.4414 0.9637 0.5471 0.6872 0.5471
5.2369 4.5802 21000 5.4945 0.8788 0.6655 0.7419 0.6655 0.8500 0.4325 0.5423 0.4325 0.9632 0.5599 0.6973 0.5599
4.9606 4.7983 22000 5.4844 0.8761 0.6840 0.7547 0.6840 0.8512 0.4290 0.5415 0.4290 0.9652 0.5418 0.6841 0.5418
4.6779 5.0164 23000 5.5106 0.8772 0.6579 0.7354 0.6579 0.8477 0.4135 0.5250 0.4135 0.9646 0.5246 0.6688 0.5246
4.7326 5.2345 24000 5.4984 0.8744 0.6873 0.7561 0.6873 0.8505 0.4363 0.5476 0.4363 0.9636 0.5922 0.7237 0.5922
4.9567 5.4526 25000 5.5120 0.8777 0.6926 0.7615 0.6926 0.8475 0.4445 0.5536 0.4445 0.9641 0.6002 0.7301 0.6002
4.6118 5.6707 26000 5.5006 0.8725 0.6933 0.7585 0.6933 0.8497 0.4248 0.5356 0.4248 0.9644 0.5786 0.7145 0.5786
4.7405 5.8888 27000 5.4864 0.8763 0.6817 0.7528 0.6817 0.8477 0.4325 0.5443 0.4325 0.9635 0.5606 0.6981 0.5606
4.4146 6.1069 28000 5.5289 0.8775 0.6627 0.7393 0.6627 0.8496 0.4516 0.5632 0.4516 0.9632 0.5915 0.7224 0.5915
4.215 6.3250 29000 5.4930 0.8777 0.6716 0.7465 0.6716 0.8488 0.4454 0.5573 0.4454 0.9631 0.5862 0.7191 0.5862
4.5322 6.5431 30000 5.5396 0.8774 0.6832 0.7544 0.6832 0.8506 0.4505 0.5635 0.4505 0.9638 0.5915 0.7238 0.5915
4.4777 6.7612 31000 5.5106 0.8775 0.6835 0.7550 0.6835 0.8507 0.4270 0.5408 0.4270 0.9644 0.5553 0.6945 0.5553
4.6033 6.9793 32000 5.5255 0.8783 0.6917 0.7608 0.6917 0.8516 0.4407 0.5540 0.4407 0.9636 0.5820 0.7156 0.5820
4.4052 7.1974 33000 5.5653 0.8783 0.6830 0.7551 0.6830 0.8480 0.4535 0.5636 0.4535 0.9646 0.5724 0.7090 0.5724
4.3635 7.4155 34000 5.6034 0.8776 0.6947 0.7627 0.6947 0.8487 0.4501 0.5624 0.4501 0.9638 0.5915 0.7234 0.5915
4.22 7.6336 35000 5.5979 0.8756 0.6946 0.7618 0.6946 0.8509 0.4384 0.5501 0.4384 0.9637 0.5973 0.7287 0.5973
4.5977 7.8517 36000 5.5612 0.8776 0.6732 0.7473 0.6732 0.8508 0.4493 0.5606 0.4493 0.9636 0.5804 0.7140 0.5804
3.7723 8.0698 37000 5.6254 0.8747 0.6856 0.7542 0.6856 0.8505 0.4477 0.5601 0.4477 0.9640 0.5704 0.7067 0.5704
4.146 8.2879 38000 5.6130 0.8751 0.6890 0.7573 0.6890 0.8498 0.4583 0.5698 0.4583 0.9642 0.5847 0.7192 0.5847
4.1413 8.5060 39000 5.6423 0.8775 0.6749 0.7486 0.6749 0.8501 0.4469 0.5584 0.4469 0.9641 0.6015 0.7314 0.6015
4.0498 8.7241 40000 5.6417 0.8786 0.6736 0.7484 0.6736 0.8503 0.4516 0.5629 0.4516 0.9644 0.5730 0.7094 0.5730
4.0145 8.9422 41000 5.6702 0.8783 0.6770 0.7498 0.6770 0.8513 0.4337 0.5459 0.4337 0.9639 0.5475 0.6875 0.5475
4.1612 9.1603 42000 5.6969 0.8778 0.6729 0.7472 0.6729 0.8495 0.4266 0.5381 0.4266 0.9642 0.5741 0.7108 0.5741
3.8686 9.3784 43000 5.6880 0.8766 0.6986 0.7649 0.6986 0.8511 0.4643 0.5750 0.4643 0.9629 0.6108 0.7378 0.6108
4.0507 9.5965 44000 5.7104 0.8767 0.6893 0.7584 0.6893 0.8517 0.4449 0.5574 0.4449 0.9634 0.5767 0.7113 0.5767
3.9494 9.8146 45000 5.7410 0.8730 0.7032 0.7658 0.7032 0.8528 0.4436 0.5546 0.4436 0.9629 0.6167 0.7425 0.6167
3.5159 10.0327 46000 5.7842 0.8732 0.6913 0.7584 0.6913 0.8508 0.4408 0.5520 0.4408 0.9636 0.5682 0.7044 0.5682
3.824 10.2508 47000 5.7613 0.8757 0.6830 0.7535 0.6830 0.8508 0.4718 0.5803 0.4718 0.9633 0.6105 0.7384 0.6105
3.9568 10.4689 48000 5.7965 0.8814 0.6696 0.7473 0.6696 0.8521 0.4544 0.5655 0.4544 0.9643 0.5850 0.7199 0.5850
4.0009 10.6870 49000 5.8102 0.8749 0.7054 0.7689 0.7054 0.8499 0.4459 0.5578 0.4459 0.9626 0.6197 0.7449 0.6197
3.9041 10.9051 50000 5.8216 0.8772 0.6723 0.7466 0.6723 0.8509 0.4327 0.5444 0.4327 0.9636 0.5659 0.7025 0.5659
3.4441 11.1232 51000 5.8319 0.8792 0.6731 0.7484 0.6731 0.8513 0.4240 0.5347 0.4240 0.9640 0.5709 0.7075 0.5709
3.6306 11.3413 52000 5.8787 0.8754 0.7004 0.7659 0.7004 0.8522 0.4577 0.5690 0.4577 0.9628 0.5937 0.7243 0.5937
3.4473 11.5594 53000 5.9255 0.8770 0.6856 0.7561 0.6856 0.8503 0.4424 0.5534 0.4424 0.9642 0.5847 0.7190 0.5847
3.5747 11.7775 54000 5.9173 0.8748 0.6842 0.7538 0.6842 0.8516 0.4401 0.5510 0.4401 0.9624 0.6076 0.7352 0.6076
3.9448 11.9956 55000 5.9105 0.8759 0.6948 0.7616 0.6948 0.8524 0.4621 0.5725 0.4621 0.9624 0.6115 0.7379 0.6115
3.4954 12.2137 56000 6.0364 0.8745 0.7042 0.7681 0.7042 0.8515 0.4508 0.5622 0.4508 0.9631 0.6142 0.7412 0.6142
3.5057 12.4318 57000 5.9604 0.8752 0.6972 0.7636 0.6972 0.8521 0.4571 0.5683 0.4571 0.9635 0.5976 0.7290 0.5976
3.268 12.6499 58000 6.0932 0.8749 0.6985 0.7649 0.6985 0.8515 0.4767 0.5862 0.4767 0.9631 0.6215 0.7464 0.6215
3.1336 12.8680 59000 6.0546 0.8764 0.6784 0.7506 0.6784 0.8526 0.4578 0.5679 0.4578 0.9630 0.5954 0.7265 0.5954
3.3501 13.0862 60000 6.0474 0.8765 0.6685 0.7439 0.6685 0.8508 0.4455 0.5574 0.4455 0.9628 0.5888 0.7208 0.5888
3.3062 13.3043 61000 6.0900 0.8772 0.6760 0.7498 0.6760 0.8519 0.4451 0.5566 0.4451 0.9623 0.6128 0.7397 0.6128
3.0751 13.5224 62000 6.1283 0.8762 0.6838 0.7544 0.6838 0.8498 0.4517 0.5624 0.4517 0.9631 0.6043 0.7332 0.6043
3.2603 13.7405 63000 6.1809 0.8749 0.6928 0.7606 0.6928 0.8506 0.4514 0.5625 0.4514 0.9624 0.6152 0.7406 0.6152
3.4362 13.9586 64000 6.1157 0.8772 0.6904 0.7604 0.6904 0.8524 0.4584 0.5698 0.4584 0.9627 0.6149 0.7414 0.6149
3.0973 14.1767 65000 6.1833 0.8760 0.6887 0.7575 0.6887 0.8513 0.4522 0.5631 0.4522 0.9628 0.6055 0.7338 0.6055
3.0773 14.3948 66000 6.2383 0.8752 0.6925 0.7599 0.6925 0.8515 0.4598 0.5705 0.4598 0.9621 0.6186 0.7435 0.6186
3.2577 14.6129 67000 6.2329 0.8783 0.6695 0.7452 0.6695 0.8520 0.4534 0.5647 0.4534 0.9624 0.5954 0.7259 0.5954
3.1827 14.8310 68000 6.2409 0.8769 0.6729 0.7463 0.6729 0.8522 0.4388 0.5517 0.4388 0.9624 0.5872 0.7188 0.5872
2.9558 15.0491 69000 6.3070 0.8752 0.6780 0.7494 0.6780 0.8505 0.4571 0.5682 0.4571 0.9630 0.5995 0.7296 0.5995
2.8572 15.2672 70000 6.3755 0.8749 0.6876 0.7558 0.6876 0.8498 0.4667 0.5763 0.4667 0.9624 0.6387 0.7592 0.6387
3.1326 15.4853 71000 6.3827 0.8724 0.7040 0.7665 0.7040 0.8508 0.4621 0.5731 0.4621 0.9628 0.6186 0.7441 0.6186
3.2578 15.7034 72000 6.3964 0.8740 0.6948 0.7610 0.6948 0.8518 0.4565 0.5677 0.4565 0.9631 0.6069 0.7351 0.6069
3.0824 15.9215 73000 6.4341 0.8743 0.6914 0.7582 0.6914 0.8520 0.4486 0.5602 0.4486 0.9621 0.6095 0.7360 0.6095
2.9715 16.1396 74000 6.4227 0.8721 0.6963 0.7601 0.6963 0.8519 0.4458 0.5573 0.4458 0.9624 0.6108 0.7375 0.6108
3.0745 16.3577 75000 6.4038 0.8730 0.6867 0.7543 0.6867 0.8509 0.4499 0.5601 0.4499 0.9625 0.6076 0.7357 0.6076
3.1964 16.5758 76000 6.4467 0.8747 0.6891 0.7573 0.6891 0.8515 0.4574 0.5680 0.4574 0.9631 0.6004 0.7306 0.6004
2.9492 16.7939 77000 6.4520 0.8736 0.6770 0.7477 0.6770 0.8518 0.4426 0.5547 0.4426 0.9632 0.6151 0.7420 0.6151
3.0828 17.0120 78000 6.4817 0.8765 0.6787 0.7508 0.6787 0.8507 0.4608 0.5706 0.4608 0.9627 0.6085 0.7358 0.6085
2.8371 17.2301 79000 6.4276 0.8742 0.6875 0.7559 0.6875 0.8527 0.4545 0.5655 0.4545 0.9626 0.6084 0.7363 0.6084
2.8705 17.4482 80000 6.5054 0.8740 0.6703 0.7434 0.6703 0.8524 0.4452 0.5574 0.4452 0.9626 0.6004 0.7298 0.6004
2.9127 17.6663 81000 6.5495 0.8748 0.6889 0.7572 0.6889 0.8516 0.4617 0.5721 0.4617 0.9626 0.6087 0.7358 0.6087
2.9532 17.8844 82000 6.6116 0.8739 0.6817 0.7510 0.6817 0.8501 0.4478 0.5578 0.4478 0.9623 0.6161 0.7414 0.6161
2.8697 18.1025 83000 6.5947 0.8740 0.6921 0.7589 0.6921 0.8523 0.4480 0.5593 0.4480 0.9623 0.6164 0.7421 0.6164
3.1279 18.3206 84000 6.6795 0.8742 0.6804 0.7506 0.6804 0.8512 0.4497 0.5611 0.4497 0.9623 0.6203 0.7447 0.6203
2.9035 18.5387 85000 6.6608 0.8739 0.6968 0.7625 0.6968 0.8515 0.4511 0.5625 0.4511 0.9618 0.6420 0.7608 0.6420
2.9629 18.7568 86000 6.6067 0.8756 0.6770 0.7487 0.6770 0.8523 0.4539 0.5654 0.4539 0.9627 0.6041 0.7330 0.6041
2.9103 18.9749 87000 6.6638 0.8757 0.6790 0.7502 0.6790 0.8512 0.4601 0.5706 0.4601 0.9621 0.6164 0.7415 0.6164
2.7208 19.1930 88000 6.7028 0.8713 0.6977 0.7610 0.6977 0.8512 0.4526 0.5634 0.4526 0.9621 0.6215 0.7454 0.6215
2.756 19.4111 89000 6.7264 0.8736 0.6828 0.7521 0.6828 0.8522 0.4590 0.5696 0.4590 0.9622 0.6204 0.7450 0.6204
2.7783 19.6292 90000 6.7362 0.8731 0.6868 0.7545 0.6868 0.8512 0.4510 0.5615 0.4510 0.9619 0.6275 0.7500 0.6275
2.7609 19.8473 91000 6.8124 0.8726 0.6912 0.7573 0.6912 0.8512 0.4647 0.5752 0.4647 0.9621 0.6191 0.7435 0.6191
2.7303 20.0654 92000 6.8258 0.8718 0.6944 0.7593 0.6944 0.8505 0.4662 0.5755 0.4662 0.9614 0.6419 0.7602 0.6419
2.5858 20.2835 93000 6.7881 0.8733 0.6944 0.7602 0.6944 0.8516 0.4609 0.5714 0.4609 0.9620 0.6337 0.7549 0.6337
2.7269 20.5016 94000 6.8458 0.8737 0.6778 0.7487 0.6778 0.8521 0.4512 0.5622 0.4512 0.9619 0.6260 0.7486 0.6260
2.8068 20.7197 95000 6.8521 0.8729 0.6907 0.7572 0.6907 0.8530 0.4646 0.5753 0.4646 0.9619 0.6347 0.7554 0.6347
2.7931 20.9378 96000 6.7579 0.8762 0.6753 0.7484 0.6753 0.8523 0.4584 0.5689 0.4584 0.9619 0.6218 0.7460 0.6218
2.5692 21.1559 97000 6.8614 0.8730 0.6899 0.7566 0.6899 0.8518 0.4547 0.5660 0.4547 0.9618 0.6271 0.7497 0.6271
2.7846 21.3740 98000 6.8749 0.8752 0.6824 0.7527 0.6824 0.8521 0.4563 0.5673 0.4563 0.9619 0.6251 0.7480 0.6251
2.59 21.5921 99000 6.9252 0.8721 0.6919 0.7574 0.6919 0.8518 0.4665 0.5765 0.4665 0.9620 0.6297 0.7516 0.6297
2.7931 21.8103 100000 6.8448 0.8731 0.6906 0.7573 0.6906 0.8520 0.4610 0.5719 0.4610 0.9616 0.6328 0.7543 0.6328
2.7317 22.0284 101000 6.8958 0.8718 0.6927 0.7581 0.6927 0.8514 0.4628 0.5730 0.4628 0.9617 0.6270 0.7494 0.6270
2.7251 22.2465 102000 6.9417 0.8742 0.6863 0.7555 0.6863 0.8518 0.4714 0.5812 0.4714 0.9614 0.6487 0.7653 0.6487
2.4123 22.4646 103000 6.9228 0.8716 0.6893 0.7557 0.6893 0.8506 0.4570 0.5676 0.4570 0.9618 0.6360 0.7567 0.6360
2.6742 22.6827 104000 6.9557 0.8717 0.6939 0.7587 0.6939 0.8515 0.4612 0.5718 0.4612 0.9619 0.6303 0.7523 0.6303
2.6853 22.9008 105000 6.9950 0.8719 0.6946 0.7594 0.6946 0.8511 0.4610 0.5717 0.4610 0.9616 0.6383 0.7578 0.6383
2.49 23.1189 106000 6.9618 0.8714 0.6949 0.7593 0.6949 0.8511 0.4599 0.5702 0.4599 0.9621 0.6314 0.7532 0.6314
2.5727 23.3370 107000 7.0083 0.8712 0.6932 0.7579 0.6932 0.8517 0.4592 0.5699 0.4592 0.9618 0.6286 0.7506 0.6286
2.6205 23.5551 108000 6.9858 0.8735 0.6886 0.7562 0.6886 0.8511 0.4619 0.5720 0.4619 0.9619 0.6309 0.7524 0.6309
2.5545 23.7732 109000 6.9862 0.8726 0.6890 0.7560 0.6890 0.8516 0.4590 0.5695 0.4590 0.9617 0.6347 0.7553 0.6347
2.6459 23.9913 110000 6.9527 0.8732 0.6868 0.7548 0.6868 0.8517 0.4564 0.5672 0.4564 0.9619 0.6281 0.7507 0.6281
2.4218 24.2094 111000 7.0087 0.8726 0.6912 0.7576 0.6912 0.8516 0.4614 0.5719 0.4614 0.9617 0.6386 0.7583 0.6386
2.3747 24.4275 112000 7.0212 0.8722 0.6899 0.7563 0.6899 0.8513 0.4624 0.5727 0.4624 0.9619 0.6295 0.7515 0.6295
2.5904 24.6456 113000 7.0198 0.8727 0.6899 0.7566 0.6899 0.8515 0.4618 0.5723 0.4618 0.9618 0.6316 0.7530 0.6316
2.5283 24.8637 114000 7.0198 0.8724 0.6914 0.7575 0.6914 0.8516 0.4625 0.5730 0.4625 0.9618 0.6313 0.7527 0.6313
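Given lr_scheduler_type: linear, the learning rate decays linearly from 5e-06 toward zero over the 114,000 optimization steps logged above. A minimal sketch of that schedule; the warmup handling is an assumption, since the card records no warmup setting:

```python
def linear_lr(step, total_steps, base_lr=5e-6, warmup_steps=0):
    # Linear warmup followed by linear decay to zero, mirroring the
    # behavior of transformers' get_linear_schedule_with_warmup.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```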

Framework versions

  • Transformers 4.49.0
  • PyTorch 2.6.0+cu124
  • Datasets 3.4.1
  • Tokenizers 0.21.1
Model size: 576M params (Safetensors, F32 tensors)
