histv4_ftis_noPretrain

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 22.9027
  • Accuracy: 0.9708
  • Macro F1: 0.9387
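
For reference, a minimal sketch of how the accuracy and macro F1 numbers could be computed, assuming scikit-learn's standard metric definitions and a Hugging Face `Trainer`-style `compute_metrics` hook (the card does not specify the actual implementation):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    """Trainer-style metrics hook: (logits, integer labels) in, metric dict out."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        # Macro F1 averages the per-class F1 scores, weighting every
        # class equally regardless of how frequent it is.
        "macro_f1": f1_score(labels, preds, average="macro"),
    }
```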

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 6733
  • training_steps: 134675
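
A minimal sketch of how these settings map onto `transformers.TrainingArguments`; this is a reconstruction from the list above, not the authors' training script, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="histv4_ftis_noPretrain",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",  # linear decay after warmup
    warmup_steps=6733,           # ~5% of the configured 134675 steps
    max_steps=134675,            # note: the results table below ends at step 92201
)
```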

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 5.6706 | 0.0050 | 673 | 163.1458 | 0.5351 | 0.1372 |
| 3.2025 | 1.0050 | 1346 | 66.2205 | 0.6024 | 0.1805 |
| 2.2709 | 2.0050 | 2019 | 23.3143 | 0.6566 | 0.2508 |
| 1.8759 | 3.0050 | 2692 | 14.8860 | 0.6862 | 0.3239 |
| 1.7911 | 4.0050 | 3365 | 9.8597 | 0.7467 | 0.4130 |
| 1.4354 | 5.0050 | 4038 | 6.3745 | 0.7854 | 0.4905 |
| 1.2613 | 6.0050 | 4711 | 7.3708 | 0.8181 | 0.5386 |
| 1.0953 | 7.0049 | 5384 | 9.4678 | 0.8289 | 0.5704 |
| 0.9656 | 8.0049 | 6057 | 10.9515 | 0.8501 | 0.6187 |
| 0.7637 | 9.0049 | 6730 | 9.6294 | 0.8727 | 0.6558 |
| 0.743 | 10.0049 | 7403 | 12.9812 | 0.8988 | 0.7010 |
| 0.5359 | 11.0049 | 8076 | 19.9032 | 0.9059 | 0.7257 |
| 0.5607 | 12.0049 | 8749 | 19.8261 | 0.9161 | 0.7501 |
| 0.4005 | 13.0049 | 9422 | 16.8337 | 0.9188 | 0.7727 |
| 0.3745 | 14.0049 | 10095 | 14.7905 | 0.9308 | 0.7819 |
| 0.3704 | 15.0049 | 10768 | 15.6947 | 0.9362 | 0.8003 |
| 0.3194 | 16.0049 | 11441 | 12.9565 | 0.9404 | 0.8072 |
| 0.2521 | 17.0049 | 12114 | 13.3948 | 0.9446 | 0.8214 |
| 0.2788 | 18.0049 | 12787 | 12.1413 | 0.9448 | 0.8331 |
| 0.24 | 19.0049 | 13460 | 9.2497 | 0.9469 | 0.8309 |
| 0.217 | 20.0048 | 14133 | 10.6283 | 0.9502 | 0.8490 |
| 0.2065 | 21.0048 | 14806 | 13.5652 | 0.9512 | 0.8493 |
| 0.1739 | 22.0048 | 15479 | 7.6495 | 0.9566 | 0.8716 |
| 0.1849 | 23.0048 | 16152 | 10.3599 | 0.9560 | 0.8739 |
| 0.1554 | 24.0048 | 16825 | 8.4639 | 0.9541 | 0.8713 |
| 0.1688 | 25.0048 | 17498 | 8.1545 | 0.9565 | 0.8774 |
| 0.1078 | 26.0048 | 18171 | 6.9724 | 0.9590 | 0.8811 |
| 0.1337 | 27.0048 | 18844 | 6.4333 | 0.9567 | 0.8795 |
| 0.1005 | 28.0048 | 19517 | 7.5471 | 0.9566 | 0.8816 |
| 0.0902 | 29.0048 | 20190 | 6.9489 | 0.9608 | 0.8921 |
| 0.1022 | 30.0048 | 20863 | 7.2287 | 0.9597 | 0.8856 |
| 0.0774 | 31.0048 | 21536 | 7.9467 | 0.9644 | 0.9068 |
| 0.0661 | 32.0048 | 22209 | 7.0721 | 0.9611 | 0.8957 |
| 0.0754 | 33.0048 | 22882 | 6.3389 | 0.9631 | 0.8987 |
| 0.0849 | 34.0047 | 23555 | 7.7401 | 0.9648 | 0.9064 |
| 0.0683 | 35.0047 | 24228 | 7.2394 | 0.9647 | 0.9113 |
| 0.0551 | 36.0047 | 24901 | 6.8625 | 0.9660 | 0.9117 |
| 0.0558 | 37.0047 | 25574 | 6.9998 | 0.9657 | 0.9168 |
| 0.0542 | 38.0047 | 26247 | 9.0917 | 0.9667 | 0.9159 |
| 0.0605 | 39.0047 | 26920 | 9.8845 | 0.9654 | 0.9136 |
| 0.0567 | 40.0047 | 27593 | 9.5249 | 0.9646 | 0.9159 |
| 0.0553 | 41.0047 | 28266 | 7.5163 | 0.9656 | 0.9174 |
| 0.0483 | 42.0047 | 28939 | 7.9149 | 0.9667 | 0.9198 |
| 0.0444 | 43.0047 | 29612 | 7.2773 | 0.9650 | 0.9156 |
| 0.0493 | 44.0047 | 30285 | 8.2501 | 0.9689 | 0.9009 |
| 0.042 | 45.0047 | 30958 | 8.3921 | 0.9668 | 0.9168 |
| 0.0332 | 46.0047 | 31631 | 7.8757 | 0.9668 | 0.9179 |
| 0.0562 | 47.0046 | 32304 | 10.3526 | 0.9671 | 0.9195 |
| 0.0463 | 48.0046 | 32977 | 10.4190 | 0.9683 | 0.9207 |
| 0.0337 | 49.0046 | 33650 | 8.7521 | 0.9680 | 0.9160 |
| 0.0299 | 50.0046 | 34323 | 8.7141 | 0.9666 | 0.9248 |
| 0.038 | 51.0046 | 34996 | 8.1480 | 0.9670 | 0.9174 |
| 0.0344 | 52.0046 | 35669 | 9.5917 | 0.9671 | 0.9102 |
| 0.0378 | 53.0046 | 36342 | 9.7323 | 0.9688 | 0.9243 |
| 0.0316 | 54.0046 | 37015 | 10.4115 | 0.9639 | 0.9186 |
| 0.0314 | 55.0046 | 37688 | 10.1031 | 0.9676 | 0.9219 |
| 0.0266 | 56.0046 | 38361 | 11.5819 | 0.9692 | 0.9295 |
| 0.0281 | 57.0046 | 39034 | 9.8724 | 0.9666 | 0.9185 |
| 0.0286 | 58.0046 | 39707 | 9.4405 | 0.9683 | 0.9241 |
| 0.0336 | 59.0046 | 40380 | 11.8320 | 0.9694 | 0.9283 |
| 0.0251 | 60.0046 | 41053 | 12.5118 | 0.9688 | 0.9333 |
| 0.0217 | 61.0045 | 41726 | 15.1022 | 0.9705 | 0.9322 |
| 0.0235 | 62.0045 | 42399 | 11.6105 | 0.9680 | 0.9263 |
| 0.0229 | 63.0045 | 43072 | 12.1260 | 0.9672 | 0.9257 |
| 0.0222 | 64.0045 | 43745 | 12.9063 | 0.9658 | 0.9323 |
| 0.0242 | 65.0045 | 44418 | 9.6198 | 0.9680 | 0.9285 |
| 0.0218 | 66.0045 | 45091 | 13.3342 | 0.9689 | 0.9320 |
| 0.0206 | 67.0045 | 45764 | 15.4714 | 0.9695 | 0.9344 |
| 0.0189 | 68.0045 | 46437 | 10.6895 | 0.9676 | 0.9269 |
| 0.0196 | 69.0045 | 47110 | 13.4871 | 0.9693 | 0.9312 |
| 0.017 | 70.0045 | 47783 | 14.8361 | 0.9686 | 0.9295 |
| 0.0185 | 71.0045 | 48456 | 13.1012 | 0.9677 | 0.9301 |
| 0.0177 | 72.0045 | 49129 | 12.6586 | 0.9673 | 0.9257 |
| 0.0188 | 73.0045 | 49802 | 16.0946 | 0.9689 | 0.9293 |
| 0.0164 | 74.0044 | 50475 | 14.0022 | 0.9700 | 0.9288 |
| 0.0162 | 75.0044 | 51148 | 10.8041 | 0.9672 | 0.9267 |
| 0.0188 | 76.0044 | 51821 | 12.3388 | 0.9675 | 0.9259 |
| 0.0158 | 77.0044 | 52494 | 14.4179 | 0.9697 | 0.9321 |
| 0.02 | 78.0044 | 53167 | 13.8769 | 0.9700 | 0.9329 |
| 0.0179 | 79.0044 | 53840 | 13.2398 | 0.9673 | 0.9306 |
| 0.0145 | 80.0044 | 54513 | 14.6983 | 0.9704 | 0.9281 |
| 0.0136 | 81.0044 | 55186 | 15.3679 | 0.9687 | 0.9362 |
| 0.0128 | 82.0044 | 55859 | 15.5807 | 0.9715 | 0.9356 |
| 0.0147 | 83.0044 | 56532 | 12.4938 | 0.9677 | 0.9299 |
| 0.017 | 84.0044 | 57205 | 14.0998 | 0.9707 | 0.9362 |
| 0.0122 | 85.0044 | 57878 | 15.6111 | 0.9685 | 0.9345 |
| 0.0151 | 86.0044 | 58551 | 11.5770 | 0.9682 | 0.9294 |
| 0.0118 | 87.0044 | 59224 | 13.1092 | 0.9696 | 0.9320 |
| 0.0136 | 88.0043 | 59897 | 14.3885 | 0.9681 | 0.9281 |
| 0.0126 | 89.0043 | 60570 | 15.1592 | 0.9697 | 0.9300 |
| 0.0147 | 90.0043 | 61243 | 16.0319 | 0.9690 | 0.9364 |
| 0.0127 | 91.0043 | 61916 | 14.8849 | 0.9694 | 0.9306 |
| 0.0123 | 92.0043 | 62589 | 15.1080 | 0.9700 | 0.9373 |
| 0.012 | 93.0043 | 63262 | 12.8495 | 0.9690 | 0.9342 |
| 0.0124 | 94.0043 | 63935 | 12.7915 | 0.9690 | 0.9313 |
| 0.0102 | 95.0043 | 64608 | 18.5121 | 0.9703 | 0.9330 |
| 0.0105 | 96.0043 | 65281 | 17.5668 | 0.9702 | 0.9338 |
| 0.0108 | 97.0043 | 65954 | 17.9356 | 0.9687 | 0.9318 |
| 0.0096 | 98.0043 | 66627 | 15.8806 | 0.9703 | 0.9344 |
| 0.0089 | 99.0043 | 67300 | 12.6527 | 0.9701 | 0.9347 |
| 0.0092 | 100.0043 | 67973 | 19.8478 | 0.9701 | 0.9329 |
| 0.0098 | 101.0042 | 68646 | 16.3105 | 0.9703 | 0.9374 |
| 0.0091 | 102.0042 | 69319 | 19.2020 | 0.9691 | 0.9310 |
| 0.0106 | 103.0042 | 69992 | 21.6559 | 0.9698 | 0.9293 |
| 0.0092 | 104.0042 | 70665 | 18.8253 | 0.9695 | 0.9291 |
| 0.0084 | 105.0042 | 71338 | 20.1503 | 0.9693 | 0.9308 |
| 0.0081 | 106.0042 | 72011 | 18.6289 | 0.9706 | 0.9351 |
| 0.0072 | 107.0042 | 72684 | 19.9631 | 0.9704 | 0.9334 |
| 0.01 | 108.0042 | 73357 | 23.2103 | 0.9687 | 0.9270 |
| 0.0078 | 109.0042 | 74030 | 21.5416 | 0.9697 | 0.9358 |
| 0.0077 | 110.0042 | 74703 | 23.7068 | 0.9707 | 0.9341 |
| 0.0073 | 111.0042 | 75376 | 22.0645 | 0.9707 | 0.9372 |
| 0.0074 | 112.0042 | 76049 | 21.4179 | 0.9704 | 0.9299 |
| 0.0092 | 113.0042 | 76722 | 23.4124 | 0.9706 | 0.9365 |
| 0.0087 | 114.0042 | 77395 | 26.8832 | 0.9712 | 0.9315 |
| 0.0077 | 115.0041 | 78068 | 24.6745 | 0.9708 | 0.9382 |
| 0.0069 | 116.0041 | 78741 | 23.7589 | 0.9708 | 0.9387 |
| 0.0085 | 117.0041 | 79414 | 19.9101 | 0.9697 | 0.9306 |
| 0.0065 | 118.0041 | 80087 | 19.5109 | 0.9703 | 0.9359 |
| 0.0061 | 119.0041 | 80760 | 22.7762 | 0.9699 | 0.9290 |
| 0.0064 | 120.0041 | 81433 | 24.8343 | 0.9701 | 0.9324 |
| 0.0063 | 121.0041 | 82106 | 26.0722 | 0.9711 | 0.9360 |
| 0.0067 | 122.0041 | 82779 | 21.4954 | 0.9709 | 0.9317 |
| 0.0071 | 123.0041 | 83452 | 27.8143 | 0.9701 | 0.9324 |
| 0.0058 | 124.0041 | 84125 | 30.4180 | 0.9693 | 0.9324 |
| 0.0073 | 125.0041 | 84798 | 28.6532 | 0.9695 | 0.9315 |
| 0.0055 | 126.0041 | 85471 | 31.8040 | 0.9706 | 0.9319 |
| 0.0056 | 127.0041 | 86144 | 32.3538 | 0.9691 | 0.9293 |
| 0.0055 | 128.0040 | 86817 | 30.3247 | 0.9700 | 0.9310 |
| 0.0066 | 129.0040 | 87490 | 36.1489 | 0.9695 | 0.9343 |
| 0.0049 | 130.0040 | 88163 | 34.2556 | 0.9686 | 0.9323 |
| 0.0049 | 131.0040 | 88836 | 33.8613 | 0.9698 | 0.9371 |
| 0.0061 | 132.0040 | 89509 | 33.1326 | 0.9701 | 0.9321 |
| 0.0048 | 133.0040 | 90182 | 34.4226 | 0.9702 | 0.9300 |
| 0.0052 | 134.0040 | 90855 | 28.6254 | 0.9692 | 0.9338 |
| 0.0046 | 135.0040 | 91528 | 33.8544 | 0.9693 | 0.9307 |
| 0.0041 | 136.0040 | 92201 | 37.8675 | 0.9704 | 0.9332 |

Framework versions

  • Transformers 4.46.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.20.1
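
As a quick environment check, the pinned versions above can be verified at import time; a minimal sketch, assuming the packages are installed under their usual names:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions this model card reports; the "+cu121" suffix is a CUDA build tag.
expected = {
    "transformers": (transformers.__version__, "4.46.0"),
    "torch": (torch.__version__, "2.3.1+cu121"),
    "datasets": (datasets.__version__, "2.20.0"),
    "tokenizers": (tokenizers.__version__, "0.20.1"),
}
for name, (found, wanted) in expected.items():
    status = "OK" if found.startswith(wanted.split("+")[0]) else "MISMATCH"
    print(f"{name}: found {found}, expected {wanted} -> {status}")
```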

Model size

  • 31.2M parameters (Safetensors, F32 tensors)