
swin-tiny-patch4-window7-224-finetuned-eurosat

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on an image-classification dataset loaded with the Datasets imagefolder builder. It achieves the following results on the evaluation set:

  • Loss: 0.3209
  • Accuracy: 0.8902

Model description

More information needed

Intended uses & limitations

More information needed
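
The checkpoint is a standard Transformers image-classification model, so inference should work through the usual pipeline API. A minimal sketch, assuming the hub id aichoux/swin-tiny-patch4-window7-224-finetuned-eurosat and a placeholder image file example.jpg (not shipped with this card):

```python
from transformers import pipeline

# Minimal sketch: "example.jpg" is a placeholder path; replace it with your own image.
classifier = pipeline(
    "image-classification",
    model="aichoux/swin-tiny-patch4-window7-224-finetuned-eurosat",
)
for prediction in classifier("example.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```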

Training and evaluation data

More information needed
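
The card only records that the data came through the imagefolder builder. A hedged sketch of that loading step, assuming a local directory with one sub-folder per class; the path and the 90/10 split below are illustrative, not taken from this card:

```python
from datasets import load_dataset

# Sketch only: "path/to/images" is a placeholder; the imagefolder builder
# infers the class labels from the sub-directory names.
dataset = load_dataset("imagefolder", data_dir="path/to/images")

# Illustrative 90/10 train/validation split; the actual split used for this
# checkpoint is not documented in the card.
splits = dataset["train"].train_test_split(test_size=0.1, seed=42)
print(splits["train"].features["label"].names)
```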

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
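
As a rough guide to reproducing this setup, the sketch below maps the list onto transformers.TrainingArguments. The output_dir and the per-epoch evaluation/save strategies are assumptions (per-epoch evaluation is inferred from the results table below); the Adam betas and epsilon listed above are the Trainer defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Sketch mapping the hyperparameter list onto TrainingArguments.
# Effective train batch size: 32 (per device) * 4 (gradient accumulation) = 128.
training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-eurosat",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch results below
    save_strategy="epoch",        # assumption
)
```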

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 8    | 2.7448          | 0.0314   |
| 2.7716        | 2.0   | 16   | 2.5834          | 0.1765   |
| 2.5974        | 3.0   | 24   | 2.3608          | 0.3020   |
| 2.3426        | 4.0   | 32   | 2.1157          | 0.3333   |
| 1.9747        | 5.0   | 40   | 1.7539          | 0.4627   |
| 1.9747        | 6.0   | 48   | 1.3641          | 0.6078   |
| 1.5182        | 7.0   | 56   | 1.0755          | 0.6471   |
| 1.198         | 8.0   | 64   | 0.8743          | 0.7216   |
| 1.0206        | 9.0   | 72   | 0.7666          | 0.7294   |
| 0.8731        | 10.0  | 80   | 0.7035          | 0.7490   |
| 0.8731        | 11.0  | 88   | 0.6122          | 0.7608   |
| 0.7938        | 12.0  | 96   | 0.6508          | 0.7490   |
| 0.7286        | 13.0  | 104  | 0.5081          | 0.7961   |
| 0.659         | 14.0  | 112  | 0.5536          | 0.7961   |
| 0.6232        | 15.0  | 120  | 0.5079          | 0.8      |
| 0.6232        | 16.0  | 128  | 0.4483          | 0.8314   |
| 0.6028        | 17.0  | 136  | 0.4096          | 0.8157   |
| 0.5333        | 18.0  | 144  | 0.3710          | 0.8510   |
| 0.5053        | 19.0  | 152  | 0.4810          | 0.8039   |
| 0.4717        | 20.0  | 160  | 0.4121          | 0.8235   |
| 0.4717        | 21.0  | 168  | 0.4021          | 0.8392   |
| 0.4728        | 22.0  | 176  | 0.3780          | 0.8588   |
| 0.4347        | 23.0  | 184  | 0.3374          | 0.8745   |
| 0.4545        | 24.0  | 192  | 0.4056          | 0.8431   |
| 0.3954        | 25.0  | 200  | 0.4088          | 0.8745   |
| 0.3954        | 26.0  | 208  | 0.4169          | 0.8392   |
| 0.4145        | 27.0  | 216  | 0.3262          | 0.8706   |
| 0.3895        | 28.0  | 224  | 0.4235          | 0.8706   |
| 0.4185        | 29.0  | 232  | 0.3482          | 0.8706   |
| 0.3686        | 30.0  | 240  | 0.3088          | 0.8824   |
| 0.3686        | 31.0  | 248  | 0.3230          | 0.8902   |
| 0.3617        | 32.0  | 256  | 0.3473          | 0.8824   |
| 0.3136        | 33.0  | 264  | 0.3793          | 0.8627   |
| 0.3482        | 34.0  | 272  | 0.3477          | 0.8588   |
| 0.3519        | 35.0  | 280  | 0.3692          | 0.8667   |
| 0.3519        | 36.0  | 288  | 0.3611          | 0.8627   |
| 0.3311        | 37.0  | 296  | 0.3233          | 0.8745   |
| 0.3222        | 38.0  | 304  | 0.3416          | 0.8627   |
| 0.3013        | 39.0  | 312  | 0.3198          | 0.8824   |
| 0.2871        | 40.0  | 320  | 0.3308          | 0.8667   |
| 0.2871        | 41.0  | 328  | 0.3246          | 0.8667   |
| 0.3154        | 42.0  | 336  | 0.3943          | 0.8667   |
| 0.2735        | 43.0  | 344  | 0.3186          | 0.8784   |
| 0.2911        | 44.0  | 352  | 0.3132          | 0.8824   |
| 0.266         | 45.0  | 360  | 0.3204          | 0.8980   |
| 0.266         | 46.0  | 368  | 0.3097          | 0.8784   |
| 0.2686        | 47.0  | 376  | 0.3075          | 0.8902   |
| 0.2818        | 48.0  | 384  | 0.3192          | 0.8902   |
| 0.2492        | 49.0  | 392  | 0.3434          | 0.8745   |
| 0.276         | 50.0  | 400  | 0.3237          | 0.8824   |
| 0.276         | 51.0  | 408  | 0.3450          | 0.8745   |
| 0.245         | 52.0  | 416  | 0.3284          | 0.8706   |
| 0.2292        | 53.0  | 424  | 0.3263          | 0.8902   |
| 0.2252        | 54.0  | 432  | 0.3216          | 0.8745   |
| 0.2483        | 55.0  | 440  | 0.3359          | 0.8863   |
| 0.2483        | 56.0  | 448  | 0.3314          | 0.8902   |
| 0.2549        | 57.0  | 456  | 0.3932          | 0.8745   |
| 0.2247        | 58.0  | 464  | 0.3189          | 0.8745   |
| 0.2344        | 59.0  | 472  | 0.3251          | 0.8745   |
| 0.2315        | 60.0  | 480  | 0.3289          | 0.8824   |
| 0.2315        | 61.0  | 488  | 0.3058          | 0.8745   |
| 0.2109        | 62.0  | 496  | 0.2999          | 0.8863   |
| 0.2325        | 63.0  | 504  | 0.3078          | 0.8980   |
| 0.2126        | 64.0  | 512  | 0.3531          | 0.8784   |
| 0.1975        | 65.0  | 520  | 0.3394          | 0.8902   |
| 0.1975        | 66.0  | 528  | 0.3113          | 0.8902   |
| 0.1998        | 67.0  | 536  | 0.3365          | 0.8941   |
| 0.2208        | 68.0  | 544  | 0.2854          | 0.9020   |
| 0.2126        | 69.0  | 552  | 0.3170          | 0.8941   |
| 0.2352        | 70.0  | 560  | 0.3155          | 0.8824   |
| 0.2352        | 71.0  | 568  | 0.3327          | 0.8824   |
| 0.1724        | 72.0  | 576  | 0.3503          | 0.8902   |
| 0.2038        | 73.0  | 584  | 0.3309          | 0.8824   |
| 0.1919        | 74.0  | 592  | 0.3299          | 0.8902   |
| 0.2199        | 75.0  | 600  | 0.3347          | 0.8863   |
| 0.2199        | 76.0  | 608  | 0.3471          | 0.8824   |
| 0.2075        | 77.0  | 616  | 0.3437          | 0.8863   |
| 0.2206        | 78.0  | 624  | 0.3161          | 0.8824   |
| 0.1655        | 79.0  | 632  | 0.3227          | 0.8784   |
| 0.1765        | 80.0  | 640  | 0.3302          | 0.8784   |
| 0.1765        | 81.0  | 648  | 0.3153          | 0.8745   |
| 0.1832        | 82.0  | 656  | 0.3010          | 0.8745   |
| 0.185         | 83.0  | 664  | 0.3266          | 0.8941   |
| 0.1627        | 84.0  | 672  | 0.3192          | 0.8941   |
| 0.176         | 85.0  | 680  | 0.3125          | 0.8863   |
| 0.176         | 86.0  | 688  | 0.3241          | 0.8745   |
| 0.1723        | 87.0  | 696  | 0.3124          | 0.8784   |
| 0.1477        | 88.0  | 704  | 0.3109          | 0.8745   |
| 0.1703        | 89.0  | 712  | 0.3196          | 0.8824   |
| 0.1919        | 90.0  | 720  | 0.3186          | 0.8980   |
| 0.1919        | 91.0  | 728  | 0.3178          | 0.8902   |
| 0.1465        | 92.0  | 736  | 0.3241          | 0.8824   |
| 0.155         | 93.0  | 744  | 0.3281          | 0.8784   |
| 0.1829        | 94.0  | 752  | 0.3263          | 0.8824   |
| 0.167         | 95.0  | 760  | 0.3282          | 0.8824   |
| 0.167         | 96.0  | 768  | 0.3290          | 0.8824   |
| 0.166         | 97.0  | 776  | 0.3253          | 0.8902   |
| 0.1756        | 98.0  | 784  | 0.3231          | 0.8863   |
| 0.157         | 99.0  | 792  | 0.3215          | 0.8902   |
| 0.1492        | 100.0 | 800  | 0.3209          | 0.8902   |

Framework versions

  • Transformers 4.33.3
  • PyTorch 1.11.0+cu113
  • Datasets 2.14.5
  • Tokenizers 0.13.3