# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV77

This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on an unknown dataset.
It achieves the following results on the evaluation set (the best checkpoint, reached at step 345 in the table below):
- Loss: 0.3654
- Accuracy: 0.8914
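
As a quick sanity check, the checkpoint can be loaded with the standard `transformers` image-classification API. A minimal sketch; the file name `example.jpg` is a placeholder, and the label names depend on the (undocumented) training dataset:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV77"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

# Preprocess a single image (resized and normalized to the 256x256 input the model expects)
image = Image.open("example.jpg").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```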
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
- mixed_precision_training: Native AMP
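
These settings map directly onto the `transformers` Trainer API. A sketch of the corresponding `TrainingArguments`, assuming the standard Trainer workflow; `output_dir` is a placeholder, and the model/dataset wiring is omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV77",  # placeholder
    learning_rate=4e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,  # 16 x 4 = total train batch size of 64
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed precision
)
```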
### Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
1.1317 | 0.9524 | 15 | 1.0850 | 0.3657 |
1.0134 | 1.9524 | 30 | 1.0048 | 0.56 |
0.9444 | 2.9524 | 45 | 0.7731 | 0.7371 |
0.6457 | 3.9524 | 60 | 0.5598 | 0.7486 |
0.5593 | 4.9524 | 75 | 0.4836 | 0.7771 |
0.4781 | 5.9524 | 90 | 0.4486 | 0.7829 |
0.4769 | 6.9524 | 105 | 0.4365 | 0.8114 |
0.4366 | 7.9524 | 120 | 0.5434 | 0.7657 |
0.4248 | 8.9524 | 135 | 0.3744 | 0.8457 |
0.3298 | 9.9524 | 150 | 0.3618 | 0.8343 |
0.342 | 10.9524 | 165 | 0.3661 | 0.8629 |
0.3033 | 11.9524 | 180 | 0.3753 | 0.8343 |
0.3106 | 12.9524 | 195 | 0.4607 | 0.8229 |
0.264 | 13.9524 | 210 | 0.3623 | 0.8457 |
0.2407 | 14.9524 | 225 | 0.3982 | 0.8343 |
0.2758 | 15.9524 | 240 | 0.3694 | 0.8514 |
0.2719 | 16.9524 | 255 | 0.5112 | 0.8171 |
0.2311 | 17.9524 | 270 | 0.3977 | 0.8571 |
0.246 | 18.9524 | 285 | 0.4087 | 0.8629 |
0.2193 | 19.9524 | 300 | 0.4239 | 0.8343 |
0.2419 | 20.9524 | 315 | 0.3980 | 0.8514 |
0.2084 | 21.9524 | 330 | 0.4278 | 0.8686 |
0.1973 | 22.9524 | 345 | 0.3654 | 0.8914 |
0.1807 | 23.9524 | 360 | 0.4050 | 0.8629 |
0.1693 | 24.9524 | 375 | 0.5299 | 0.8229 |
0.1594 | 25.9524 | 390 | 0.4832 | 0.8514 |
0.1876 | 26.9524 | 405 | 0.5069 | 0.84 |
0.1514 | 27.9524 | 420 | 0.5056 | 0.8571 |
0.1818 | 28.9524 | 435 | 0.5403 | 0.8286 |
0.1682 | 29.9524 | 450 | 0.5058 | 0.84 |
0.1681 | 30.9524 | 465 | 0.5187 | 0.8114 |
0.1394 | 31.9524 | 480 | 0.5843 | 0.8629 |
0.1659 | 32.9524 | 495 | 0.4707 | 0.8629 |
0.1753 | 33.9524 | 510 | 0.5603 | 0.8229 |
0.1884 | 34.9524 | 525 | 0.5372 | 0.8343 |
0.1399 | 35.9524 | 540 | 0.5559 | 0.8629 |
0.1603 | 36.9524 | 555 | 0.6177 | 0.8629 |
0.1353 | 37.9524 | 570 | 0.5262 | 0.8457 |
0.0874 | 38.9524 | 585 | 0.4945 | 0.8629 |
0.1054 | 39.9524 | 600 | 0.6391 | 0.8629 |
0.1156 | 40.9524 | 615 | 0.6080 | 0.8514 |
0.1247 | 41.9524 | 630 | 0.6483 | 0.8114 |
0.1396 | 42.9524 | 645 | 0.5377 | 0.8457 |
0.117 | 43.9524 | 660 | 0.5460 | 0.8629 |
0.1403 | 44.9524 | 675 | 0.6856 | 0.84 |
0.1089 | 45.9524 | 690 | 0.6401 | 0.8514 |
0.1022 | 46.9524 | 705 | 0.6795 | 0.8514 |
0.09 | 47.9524 | 720 | 0.6025 | 0.8457 |
0.0948 | 48.9524 | 735 | 0.6489 | 0.8514 |
0.1177 | 49.9524 | 750 | 0.6105 | 0.8571 |
0.0797 | 50.9524 | 765 | 0.7485 | 0.8229 |
0.0872 | 51.9524 | 780 | 0.6390 | 0.84 |
0.1038 | 52.9524 | 795 | 0.6190 | 0.8743 |
0.1361 | 53.9524 | 810 | 0.6417 | 0.8457 |
0.1205 | 54.9524 | 825 | 0.6161 | 0.84 |
0.1026 | 55.9524 | 840 | 0.5836 | 0.8514 |
0.1059 | 56.9524 | 855 | 0.6865 | 0.8571 |
0.0999 | 57.9524 | 870 | 0.7455 | 0.8629 |
0.1075 | 58.9524 | 885 | 0.7018 | 0.8343 |
0.0952 | 59.9524 | 900 | 0.6851 | 0.8286 |
0.0796 | 60.9524 | 915 | 0.6301 | 0.8514 |
0.0952 | 61.9524 | 930 | 0.6734 | 0.8343 |
0.1041 | 62.9524 | 945 | 0.6475 | 0.8514 |
0.0961 | 63.9524 | 960 | 0.7369 | 0.8229 |
0.0897 | 64.9524 | 975 | 0.7261 | 0.84 |
0.0591 | 65.9524 | 990 | 0.7303 | 0.8286 |
0.106 | 66.9524 | 1005 | 0.6512 | 0.84 |
0.0817 | 67.9524 | 1020 | 0.6835 | 0.8229 |
0.0653 | 68.9524 | 1035 | 0.7211 | 0.8514 |
0.0801 | 69.9524 | 1050 | 0.7762 | 0.8343 |
0.0754 | 70.9524 | 1065 | 0.7669 | 0.8571 |
0.067 | 71.9524 | 1080 | 0.8578 | 0.8457 |
0.0896 | 72.9524 | 1095 | 0.8271 | 0.84 |
0.0622 | 73.9524 | 1110 | 0.7458 | 0.8286 |
0.0741 | 74.9524 | 1125 | 0.7236 | 0.84 |
0.0687 | 75.9524 | 1140 | 0.7986 | 0.84 |
0.0877 | 76.9524 | 1155 | 0.7999 | 0.8286 |
0.1034 | 77.9524 | 1170 | 0.7840 | 0.8286 |
0.0716 | 78.9524 | 1185 | 0.7871 | 0.8343 |
0.0659 | 79.9524 | 1200 | 0.7860 | 0.8571 |
0.0844 | 80.9524 | 1215 | 0.8366 | 0.8514 |
0.0858 | 81.9524 | 1230 | 0.8152 | 0.8629 |
0.0531 | 82.9524 | 1245 | 0.7717 | 0.8286 |
0.075 | 83.9524 | 1260 | 0.8578 | 0.8171 |
0.059 | 84.9524 | 1275 | 0.8240 | 0.8229 |
0.0896 | 85.9524 | 1290 | 0.8907 | 0.8343 |
0.0741 | 86.9524 | 1305 | 0.8814 | 0.84 |
0.0697 | 87.9524 | 1320 | 0.9080 | 0.8286 |
0.0552 | 88.9524 | 1335 | 0.8345 | 0.8343 |
0.0576 | 89.9524 | 1350 | 0.8746 | 0.8229 |
0.0729 | 90.9524 | 1365 | 0.8196 | 0.8343 |
0.0782 | 91.9524 | 1380 | 0.8073 | 0.8343 |
0.0584 | 92.9524 | 1395 | 0.8011 | 0.8286 |
0.0471 | 93.9524 | 1410 | 0.8076 | 0.8286 |
0.0544 | 94.9524 | 1425 | 0.8390 | 0.8229 |
0.0576 | 95.9524 | 1440 | 0.8575 | 0.8286 |
0.0608 | 96.9524 | 1455 | 0.8392 | 0.8229 |
0.064 | 97.9524 | 1470 | 0.8266 | 0.8286 |
0.0742 | 98.9524 | 1485 | 0.8311 | 0.8286 |
0.0471 | 99.9524 | 1500 | 0.8333 | 0.8286 |
### Framework versions

- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 2.19.0
- Tokenizers 0.21.1