histv4_ftis_noPretrain_0329

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 30.3470
  • Accuracy: 0.9654
  • Macro F1: 0.9172
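
The gap between accuracy (0.9654) and macro F1 (0.9172) is typical under class imbalance, since macro F1 averages per-class F1 scores with equal weight regardless of class frequency. A minimal sketch of the metric (the helper name `macro_f1` is illustrative, not the evaluation code used for this card):

```python
def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores (toy sketch)."""
    classes = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        denom = 2 * tp + fp + fn  # F1 = 2*TP / (2*TP + FP + FN)
        f1s.append(2 * tp / denom if denom else 0.0)
    return sum(f1s) / len(f1s)

# Accuracy here is 3/4 = 0.75, but macro F1 is lower because the
# minority class's errors count just as much as the majority's.
print(macro_f1([0, 0, 0, 1], [0, 0, 1, 1]))
```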

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 6732
  • training_steps: 134650
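
A linear schedule with warmup ramps the learning rate from 0 to the peak over the first 6,732 steps (about 5% of the 134,650 total), then decays it linearly back to 0 by the final step. A minimal sketch using the values above (the helper `linear_schedule_lr` is illustrative, not the trainer's implementation):

```python
# Hyperparameters taken from this card.
BASE_LR = 1e-4
WARMUP_STEPS = 6_732
TRAINING_STEPS = 134_650

def linear_schedule_lr(step: int) -> float:
    """Learning rate at a given optimizer step under linear warmup + decay."""
    if step < WARMUP_STEPS:
        # Linear ramp from 0 to BASE_LR over the warmup phase.
        return BASE_LR * step / WARMUP_STEPS
    # Linear decay from BASE_LR to 0 over the remaining steps.
    return BASE_LR * max(0.0, (TRAINING_STEPS - step) / (TRAINING_STEPS - WARMUP_STEPS))

print(linear_schedule_lr(WARMUP_STEPS))  # peak learning rate
```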

Training results

Training Loss Epoch Step Validation Loss Accuracy Macro F1
14.0944 0.0025 336 104.9989 0.1840 0.0644
5.7051 1.0025 672 167.1822 0.5312 0.1381
4.0681 2.0025 1008 109.6770 0.5846 0.1548
3.2136 3.0025 1344 69.4755 0.6129 0.1768
2.477 4.0025 1680 32.8744 0.6320 0.2055
2.2543 5.0025 2016 21.8861 0.6553 0.2477
2.1162 6.0025 2352 17.5059 0.6736 0.2926
1.9294 7.0024 2688 12.7943 0.6936 0.3476
1.7024 8.0024 3024 9.5772 0.7159 0.3861
1.6392 9.0024 3360 7.7979 0.7679 0.4240
1.4076 10.0024 3696 6.5200 0.7873 0.4719
1.2468 11.0024 4032 6.3598 0.7791 0.4443
1.2523 12.0024 4368 7.2844 0.8128 0.5311
1.1081 13.0024 4704 5.6548 0.8271 0.5559
1.06 14.0024 5040 7.1206 0.8342 0.5617
0.9138 15.0024 5376 7.9868 0.8431 0.5930
0.822 16.0024 5712 8.0841 0.8472 0.6184
0.8428 17.0024 6048 9.9737 0.8587 0.6244
0.7317 18.0024 6384 8.1017 0.8613 0.6364
0.6619 19.0024 6720 10.9551 0.8671 0.6450
0.7468 20.0023 7056 16.1421 0.8672 0.6603
0.6155 21.0023 7392 13.5708 0.8800 0.6918
0.6229 22.0023 7728 12.3995 0.8915 0.7134
0.5425 23.0023 8064 17.1976 0.9000 0.7229
0.4771 24.0023 8400 13.2375 0.9023 0.7330
0.4006 25.0023 8736 13.6631 0.9059 0.7446
0.4207 26.0023 9072 14.1532 0.9140 0.7575
0.3106 27.0023 9408 14.6463 0.9196 0.7746
0.2946 28.0023 9744 10.9629 0.9236 0.7807
0.2848 29.0023 10080 12.8601 0.9217 0.7814
0.2637 30.0023 10416 17.2428 0.9275 0.7939
0.2214 31.0023 10752 17.8229 0.9302 0.7994
0.2044 32.0023 11088 15.7782 0.9326 0.8107
0.2472 33.0023 11424 13.8583 0.9339 0.8089
0.1953 34.0022 11760 11.8408 0.9366 0.8119
0.1596 35.0022 12096 17.0323 0.9333 0.8131
0.1708 36.0022 12432 11.7012 0.9413 0.8233
0.1512 37.0022 12768 12.2405 0.9432 0.8258
0.1209 38.0022 13104 13.0224 0.9452 0.8396
0.1107 39.0022 13440 12.4918 0.9422 0.8414
0.1011 40.0022 13776 10.8608 0.9426 0.8406
0.1117 41.0022 14112 9.3011 0.9451 0.8462
0.0934 42.0022 14448 10.3171 0.9476 0.8529
0.0869 43.0022 14784 9.0230 0.9463 0.8515
0.0994 44.0022 15120 8.0696 0.9504 0.8567
0.079 45.0022 15456 9.1054 0.9476 0.8507
0.0794 46.0022 15792 7.2162 0.9468 0.8566
0.0729 47.0021 16128 6.8036 0.9516 0.8657
0.0741 48.0021 16464 8.4440 0.9544 0.8676
0.065 49.0021 16800 7.4610 0.9525 0.8647
0.0698 50.0021 17136 6.9018 0.9523 0.8638
0.0628 51.0021 17472 8.6704 0.9542 0.8703
0.0593 52.0021 17808 7.8352 0.9547 0.8726
0.0642 53.0021 18144 9.3772 0.9560 0.8771
0.0583 54.0021 18480 7.2395 0.9550 0.8719
0.0594 55.0021 18816 7.6058 0.9516 0.8750
0.0477 56.0021 19152 7.3676 0.9550 0.8785
0.0531 57.0021 19488 7.6097 0.9542 0.8739
0.048 58.0021 19824 8.3058 0.9561 0.8782
0.0523 59.0021 20160 7.9110 0.9579 0.8784
0.0442 60.0020 20496 8.4492 0.9549 0.8799
0.0386 61.0020 20832 8.5259 0.9530 0.8759
0.0443 62.0020 21168 7.9769 0.9573 0.8766
0.04 63.0020 21504 8.9177 0.9579 0.8821
0.0425 64.0020 21840 8.5140 0.9585 0.8843
0.0328 65.0020 22176 7.7798 0.9589 0.8854
0.0344 66.0020 22512 6.8881 0.9568 0.8806
0.0406 67.0020 22848 7.8357 0.9564 0.8831
0.0323 68.0020 23184 7.9853 0.9587 0.8924
0.0335 69.0020 23520 9.8914 0.9556 0.8895
0.0315 70.0020 23856 7.8691 0.9601 0.8912
0.0336 71.0020 24192 9.8346 0.9601 0.8914
0.029 72.0020 24528 8.3280 0.9592 0.8937
0.0329 73.0020 24864 7.5500 0.9586 0.8899
0.0334 74.0019 25200 6.9437 0.9569 0.8888
0.0294 75.0019 25536 7.7684 0.9603 0.8872
0.0293 76.0019 25872 9.4957 0.9619 0.8946
0.0285 77.0019 26208 7.8132 0.9597 0.8946
0.0233 78.0019 26544 8.4906 0.9614 0.8952
0.0223 79.0019 26880 7.9184 0.9604 0.8924
0.023 80.0019 27216 8.0255 0.9591 0.8951
0.0244 81.0019 27552 9.1397 0.9616 0.8965
0.0282 82.0019 27888 10.0710 0.9618 0.8991
0.0279 83.0019 28224 7.2937 0.9613 0.8987
0.0231 84.0019 28560 8.6195 0.9598 0.8966
0.0225 85.0019 28896 8.0029 0.9598 0.8986
0.0271 86.0019 29232 8.9803 0.9599 0.8953
0.0261 87.0018 29568 8.3564 0.9587 0.8962
0.0213 88.0018 29904 8.7615 0.9597 0.8972
0.0189 89.0018 30240 8.9518 0.9610 0.9010
0.019 90.0018 30576 8.8315 0.9592 0.8982
0.0207 91.0018 30912 7.7779 0.9598 0.8956
0.0196 92.0018 31248 10.8937 0.9604 0.8956
0.0184 93.0018 31584 10.6790 0.9630 0.8975
0.0184 94.0018 31920 8.5520 0.9600 0.8988
0.0213 95.0018 32256 10.4591 0.9613 0.8991
0.0162 96.0018 32592 9.5195 0.9606 0.9009
0.0175 97.0018 32928 6.8970 0.9626 0.9015
0.0194 98.0018 33264 8.9329 0.9604 0.8981
0.022 99.0018 33600 7.9657 0.9607 0.8993
0.0191 100.0018 33936 8.4326 0.9620 0.9000
0.0167 101.0017 34272 10.7159 0.9616 0.9025
0.0174 102.0017 34608 11.1812 0.9610 0.9008
0.0218 103.0017 34944 10.5597 0.9606 0.9003
0.0145 104.0017 35280 12.7143 0.9604 0.8987
0.0144 105.0017 35616 11.6709 0.9620 0.9048
0.0154 106.0017 35952 13.1142 0.9621 0.9029
0.0173 107.0017 36288 13.0955 0.9602 0.9023
0.017 108.0017 36624 15.2173 0.9617 0.9040
0.015 109.0017 36960 15.6876 0.9603 0.9024
0.0109 110.0017 37296 18.0828 0.9632 0.9066
0.0118 111.0017 37632 13.0177 0.9624 0.9047
0.0128 112.0017 37968 15.8398 0.9612 0.9035
0.0149 113.0017 38304 13.0027 0.9591 0.9049
0.0137 114.0016 38640 11.2986 0.9624 0.9065
0.0114 115.0016 38976 12.9842 0.9634 0.9047
0.0111 116.0016 39312 13.1338 0.9623 0.9072
0.0132 117.0016 39648 14.0856 0.9617 0.9066
0.0128 118.0016 39984 12.8400 0.9621 0.9045
0.0184 119.0016 40320 10.2663 0.9633 0.9047
0.0154 120.0016 40656 17.5847 0.9615 0.9071
0.0109 121.0016 40992 15.9856 0.9626 0.9065
0.0146 122.0016 41328 13.3834 0.9617 0.9110
0.0114 123.0016 41664 14.1382 0.9594 0.9026
0.0102 124.0016 42000 11.2743 0.9604 0.9055
0.0087 125.0016 42336 14.5875 0.9635 0.9092
0.0104 126.0016 42672 14.7828 0.9633 0.9056
0.0097 127.0016 43008 16.7864 0.9642 0.9068
0.0159 128.0015 43344 15.6876 0.9649 0.9091
0.0085 129.0015 43680 16.7435 0.9640 0.9083
0.0086 130.0015 44016 15.0204 0.9591 0.9047
0.0076 131.0015 44352 18.7707 0.9636 0.9089
0.0069 132.0015 44688 21.1344 0.9634 0.9093
0.0086 133.0015 45024 18.7865 0.9643 0.9103
0.008 134.0015 45360 21.7679 0.9631 0.9090
0.0086 135.0015 45696 21.7506 0.9635 0.9105
0.0076 136.0015 46032 22.2837 0.9606 0.9049
0.008 137.0015 46368 16.4026 0.9633 0.9073
0.0078 138.0015 46704 19.1262 0.9640 0.9114
0.0068 139.0015 47040 21.4921 0.9632 0.9107
0.006 140.0015 47376 18.9914 0.9618 0.9064
0.0105 141.0014 47712 20.2923 0.9643 0.9102
0.0078 142.0014 48048 18.3372 0.9638 0.9116
0.008 143.0014 48384 21.3536 0.9645 0.9103
0.0067 144.0014 48720 19.8095 0.9641 0.9138
0.0069 145.0014 49056 20.6651 0.9625 0.9135
0.0059 146.0014 49392 26.9011 0.9645 0.9126
0.0078 147.0014 49728 24.2397 0.9639 0.9103
0.0072 148.0014 50064 22.9266 0.9637 0.9080
0.0058 149.0014 50400 25.1212 0.9631 0.9107
0.0057 150.0014 50736 23.2056 0.9637 0.9093
0.0066 151.0014 51072 25.0150 0.9624 0.9101
0.0079 152.0014 51408 30.4183 0.9614 0.9075
0.0107 153.0014 51744 24.0838 0.9638 0.9108
0.0067 154.0014 52080 27.8235 0.9641 0.9111
0.0051 155.0013 52416 17.7165 0.9647 0.9064
0.0045 156.0013 52752 25.6944 0.9656 0.9122
0.0037 157.0013 53088 33.4122 0.9643 0.9132
0.0057 158.0013 53424 27.7718 0.9623 0.9124
0.0053 159.0013 53760 24.4108 0.9613 0.9099
0.0093 160.0013 54096 27.7845 0.9631 0.9147
0.0059 161.0013 54432 26.1313 0.9624 0.9096
0.005 162.0013 54768 25.9132 0.9646 0.9101
0.0046 163.0013 55104 27.2481 0.9645 0.9104
0.0043 164.0013 55440 26.1628 0.9632 0.9080
0.0048 165.0013 55776 27.8323 0.9654 0.9112
0.009 166.0013 56112 33.3608 0.9653 0.8949
0.0037 167.0013 56448 37.2957 0.9634 0.9085
0.0036 168.0012 56784 33.1197 0.9648 0.9126
0.0037 169.0012 57120 37.9963 0.9641 0.9108
0.0041 170.0012 57456 39.9141 0.9639 0.9084
0.0058 171.0012 57792 41.6867 0.9634 0.9117
0.0047 172.0012 58128 29.9161 0.9634 0.9035
0.0049 173.0012 58464 38.3553 0.9635 0.9133
0.0036 174.0012 58800 39.2338 0.9639 0.9120
0.0038 175.0012 59136 25.2966 0.9642 0.9124
0.0031 176.0012 59472 37.9518 0.9645 0.9098
0.0061 177.0012 59808 31.5752 0.9655 0.9159
0.0039 178.0012 60144 31.6552 0.9636 0.9110
0.0028 179.0012 60480 36.9479 0.9641 0.9113
0.0036 180.0012 60816 30.0278 0.9626 0.9123
0.0047 181.0012 61152 39.5372 0.9621 0.9120
0.0028 182.0011 61488 32.8294 0.9646 0.9110
0.0037 183.0011 61824 35.6372 0.9645 0.9153
0.004 184.0011 62160 45.1993 0.9631 0.9138
0.0037 185.0011 62496 40.1815 0.9629 0.9096
0.0045 186.0011 62832 31.0659 0.9641 0.9128
0.0031 187.0011 63168 48.8519 0.9627 0.9068
0.0034 188.0011 63504 42.3715 0.9645 0.9135
0.0031 189.0011 63840 57.5944 0.9639 0.9111
0.0027 190.0011 64176 45.7431 0.9651 0.9144
0.0033 191.0011 64512 39.2846 0.9652 0.9160
0.0024 192.0011 64848 40.1731 0.9652 0.9140
0.0025 193.0011 65184 51.9353 0.9637 0.9154
0.0021 194.0011 65520 45.9358 0.9652 0.9158
0.0031 195.0010 65856 53.5394 0.9647 0.9099
0.003 196.0010 66192 43.7920 0.9645 0.9113
0.0032 197.0010 66528 30.8682 0.9654 0.9172
0.0043 198.0010 66864 47.5004 0.9628 0.9060
0.0046 199.0010 67200 36.3755 0.9612 0.9084
0.0032 200.0010 67536 47.0794 0.9655 0.9142
0.0035 201.0010 67872 41.2678 0.9662 0.9160
0.0019 202.0010 68208 36.4351 0.9654 0.9157
0.0019 203.0010 68544 41.1641 0.9647 0.9147
0.002 204.0010 68880 34.7505 0.9653 0.9136
0.0017 205.0010 69216 45.4020 0.9655 0.9133
0.0024 206.0010 69552 32.1995 0.9647 0.9132
0.0067 207.0010 69888 43.7479 0.9609 0.9112
0.0025 208.0010 70224 42.6293 0.9646 0.9120
0.0055 209.0009 70560 51.8165 0.9606 0.9111
0.0026 210.0009 70896 47.1401 0.9646 0.9155
0.002 211.0009 71232 39.8431 0.9633 0.9102
0.0019 212.0009 71568 51.8448 0.9615 0.9107
0.0023 213.0009 71904 40.8995 0.9641 0.9138
0.0027 214.0009 72240 41.7483 0.9637 0.9137
0.0031 215.0009 72576 41.5560 0.9653 0.9160
0.0028 216.0009 72912 46.3384 0.9640 0.9126
0.0017 217.0009 73248 50.6346 0.9654 0.9140

Framework versions

  • Transformers 4.46.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.20.1
Model details

  • Format: Safetensors
  • Model size: 31.2M params
  • Tensor type: F32