segformer-b0-finetuned-batch3-26May-2

This model is a fine-tuned version of PushkarA07/segformer-b0-finetuned-batch2w5-15Dec on the PushkarA07/batch3-tiles_third dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0007
  • Mean Iou: 0.9173
  • Mean Accuracy: 0.9515
  • Overall Accuracy: 0.9997
  • Accuracy Abnormality: 0.9030
  • Iou Abnormality: 0.8348
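
The descriptive sections below are still placeholders, so here is a minimal, unofficial inference sketch. Assumptions not stated in the card: the checkpoint loads from the Hub under the name above, inputs are RGB tile images (the `tile.png` path is hypothetical), and the label set is binary background/abnormality, as the per-class metrics suggest.

```python
# Minimal inference sketch (not from the original card); see assumptions above.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "PushkarA07/segformer-b0-finetuned-batch3-26May-2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tile.png").convert("RGB")  # hypothetical input tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```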

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
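
For reference, the hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch, not the author's original script; `output_dir` and the surrounding `Trainer` wiring are assumptions.

```python
# Hedged sketch of Trainer settings matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-batch3-26May-2",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```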

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:---------------:|
| 0.0012 | 0.7143 | 10 | 0.0017 | 0.8437 | 0.8917 | 0.9994 | 0.7835 | 0.6879 |
| 0.0012 | 1.4286 | 20 | 0.0013 | 0.8539 | 0.8779 | 0.9995 | 0.7559 | 0.7082 |
| 0.001 | 2.1429 | 30 | 0.0012 | 0.8684 | 0.8944 | 0.9996 | 0.7889 | 0.7372 |
| 0.0006 | 2.8571 | 40 | 0.0011 | 0.8746 | 0.8991 | 0.9996 | 0.7983 | 0.7496 |
| 0.001 | 3.5714 | 50 | 0.0010 | 0.8839 | 0.9185 | 0.9996 | 0.8371 | 0.7681 |
| 0.0012 | 4.2857 | 60 | 0.0010 | 0.8867 | 0.9189 | 0.9996 | 0.8380 | 0.7737 |
| 0.0022 | 5.0 | 70 | 0.0010 | 0.8901 | 0.9211 | 0.9996 | 0.8423 | 0.7806 |
| 0.0017 | 5.7143 | 80 | 0.0009 | 0.8913 | 0.9254 | 0.9996 | 0.8510 | 0.7829 |
| 0.0016 | 6.4286 | 90 | 0.0009 | 0.8921 | 0.9237 | 0.9996 | 0.8475 | 0.7846 |
| 0.001 | 7.1429 | 100 | 0.0009 | 0.8946 | 0.9278 | 0.9996 | 0.8557 | 0.7895 |
| 0.0012 | 7.8571 | 110 | 0.0009 | 0.8935 | 0.9226 | 0.9996 | 0.8453 | 0.7873 |
| 0.0011 | 8.5714 | 120 | 0.0009 | 0.8963 | 0.9314 | 0.9996 | 0.8629 | 0.7929 |
| 0.001 | 9.2857 | 130 | 0.0009 | 0.8980 | 0.9325 | 0.9996 | 0.8652 | 0.7963 |
| 0.0006 | 10.0 | 140 | 0.0009 | 0.8978 | 0.9303 | 0.9996 | 0.8608 | 0.7959 |
| 0.001 | 10.7143 | 150 | 0.0009 | 0.8996 | 0.9366 | 0.9997 | 0.8732 | 0.7995 |
| 0.001 | 11.4286 | 160 | 0.0009 | 0.9016 | 0.9463 | 0.9997 | 0.8928 | 0.8036 |
| 0.0004 | 12.1429 | 170 | 0.0009 | 0.9019 | 0.9494 | 0.9997 | 0.8990 | 0.8042 |
| 0.0002 | 12.8571 | 180 | 0.0009 | 0.9004 | 0.9341 | 0.9997 | 0.8683 | 0.8012 |
| 0.0011 | 13.5714 | 190 | 0.0009 | 0.9026 | 0.9488 | 0.9997 | 0.8977 | 0.8055 |
| 0.0005 | 14.2857 | 200 | 0.0008 | 0.9014 | 0.9385 | 0.9997 | 0.8772 | 0.8031 |
| 0.0007 | 15.0 | 210 | 0.0008 | 0.9013 | 0.9354 | 0.9997 | 0.8709 | 0.8028 |
| 0.0013 | 15.7143 | 220 | 0.0008 | 0.9047 | 0.9445 | 0.9997 | 0.8892 | 0.8098 |
| 0.0004 | 16.4286 | 230 | 0.0008 | 0.9015 | 0.9334 | 0.9997 | 0.8670 | 0.8034 |
| 0.0009 | 17.1429 | 240 | 0.0008 | 0.9057 | 0.9500 | 0.9997 | 0.9002 | 0.8117 |
| 0.0016 | 17.8571 | 250 | 0.0008 | 0.9060 | 0.9451 | 0.9997 | 0.8904 | 0.8124 |
| 0.0011 | 18.5714 | 260 | 0.0008 | 0.9052 | 0.9432 | 0.9997 | 0.8865 | 0.8107 |
| 0.0007 | 19.2857 | 270 | 0.0008 | 0.9069 | 0.9476 | 0.9997 | 0.8953 | 0.8141 |
| 0.0007 | 20.0 | 280 | 0.0008 | 0.9073 | 0.9488 | 0.9997 | 0.8977 | 0.8150 |
| 0.001 | 20.7143 | 290 | 0.0008 | 0.9033 | 0.9329 | 0.9997 | 0.8660 | 0.8068 |
| 0.0006 | 21.4286 | 300 | 0.0008 | 0.9079 | 0.9492 | 0.9997 | 0.8985 | 0.8162 |
| 0.0009 | 22.1429 | 310 | 0.0008 | 0.9070 | 0.9494 | 0.9997 | 0.8990 | 0.8143 |
| 0.0007 | 22.8571 | 320 | 0.0008 | 0.9070 | 0.9438 | 0.9997 | 0.8877 | 0.8142 |
| 0.0006 | 23.5714 | 330 | 0.0008 | 0.9071 | 0.9458 | 0.9997 | 0.8918 | 0.8146 |
| 0.001 | 24.2857 | 340 | 0.0008 | 0.9088 | 0.9455 | 0.9997 | 0.8912 | 0.8179 |
| 0.0006 | 25.0 | 350 | 0.0008 | 0.9105 | 0.9477 | 0.9997 | 0.8955 | 0.8214 |
| 0.0009 | 25.7143 | 360 | 0.0008 | 0.9090 | 0.9477 | 0.9997 | 0.8955 | 0.8184 |
| 0.001 | 26.4286 | 370 | 0.0008 | 0.9096 | 0.9521 | 0.9997 | 0.9043 | 0.8196 |
| 0.0012 | 27.1429 | 380 | 0.0008 | 0.9089 | 0.9465 | 0.9997 | 0.8931 | 0.8181 |
| 0.0006 | 27.8571 | 390 | 0.0008 | 0.9100 | 0.9487 | 0.9997 | 0.8976 | 0.8203 |
| 0.0006 | 28.5714 | 400 | 0.0008 | 0.9097 | 0.9484 | 0.9997 | 0.8970 | 0.8198 |
| 0.0004 | 29.2857 | 410 | 0.0008 | 0.9088 | 0.9565 | 0.9997 | 0.9131 | 0.8179 |
| 0.0013 | 30.0 | 420 | 0.0008 | 0.9073 | 0.9413 | 0.9997 | 0.8828 | 0.8150 |
| 0.0007 | 30.7143 | 430 | 0.0008 | 0.9086 | 0.9441 | 0.9997 | 0.8883 | 0.8176 |
| 0.0011 | 31.4286 | 440 | 0.0008 | 0.9109 | 0.9575 | 0.9997 | 0.9151 | 0.8221 |
| 0.0004 | 32.1429 | 450 | 0.0008 | 0.9112 | 0.9525 | 0.9997 | 0.9051 | 0.8227 |
| 0.0011 | 32.8571 | 460 | 0.0008 | 0.9118 | 0.9469 | 0.9997 | 0.8939 | 0.8239 |
| 0.0006 | 33.5714 | 470 | 0.0008 | 0.9112 | 0.9559 | 0.9997 | 0.9119 | 0.8228 |
| 0.0004 | 34.2857 | 480 | 0.0008 | 0.9104 | 0.9535 | 0.9997 | 0.9072 | 0.8210 |
| 0.0006 | 35.0 | 490 | 0.0008 | 0.9107 | 0.9450 | 0.9997 | 0.8902 | 0.8218 |
| 0.0011 | 35.7143 | 500 | 0.0008 | 0.9128 | 0.9509 | 0.9997 | 0.9019 | 0.8258 |
| 0.0004 | 36.4286 | 510 | 0.0008 | 0.9118 | 0.9502 | 0.9997 | 0.9005 | 0.8239 |
| 0.0007 | 37.1429 | 520 | 0.0008 | 0.9135 | 0.9534 | 0.9997 | 0.9070 | 0.8273 |
| 0.0005 | 37.8571 | 530 | 0.0008 | 0.9106 | 0.9422 | 0.9997 | 0.8845 | 0.8216 |
| 0.0011 | 38.5714 | 540 | 0.0008 | 0.9125 | 0.9501 | 0.9997 | 0.9004 | 0.8252 |
| 0.0006 | 39.2857 | 550 | 0.0008 | 0.9130 | 0.9553 | 0.9997 | 0.9107 | 0.8264 |
| 0.001 | 40.0 | 560 | 0.0008 | 0.9110 | 0.9454 | 0.9997 | 0.8909 | 0.8224 |
| 0.001 | 40.7143 | 570 | 0.0008 | 0.9135 | 0.9546 | 0.9997 | 0.9094 | 0.8272 |
| 0.0009 | 41.4286 | 580 | 0.0008 | 0.9131 | 0.9529 | 0.9997 | 0.9060 | 0.8265 |
| 0.0007 | 42.1429 | 590 | 0.0008 | 0.9112 | 0.9479 | 0.9997 | 0.8959 | 0.8227 |
| 0.0005 | 42.8571 | 600 | 0.0007 | 0.9131 | 0.9514 | 0.9997 | 0.9029 | 0.8265 |
| 0.0005 | 43.5714 | 610 | 0.0008 | 0.9110 | 0.9435 | 0.9997 | 0.8871 | 0.8224 |
| 0.0005 | 44.2857 | 620 | 0.0008 | 0.9126 | 0.9575 | 0.9997 | 0.9152 | 0.8255 |
| 0.0003 | 45.0 | 630 | 0.0007 | 0.9121 | 0.9480 | 0.9997 | 0.8962 | 0.8244 |
| 0.0003 | 45.7143 | 640 | 0.0008 | 0.9109 | 0.9432 | 0.9997 | 0.8865 | 0.8221 |
| 0.0006 | 46.4286 | 650 | 0.0007 | 0.9139 | 0.9519 | 0.9997 | 0.9039 | 0.8281 |
| 0.0003 | 47.1429 | 660 | 0.0008 | 0.9132 | 0.9547 | 0.9997 | 0.9096 | 0.8267 |
| 0.0012 | 47.8571 | 670 | 0.0008 | 0.9114 | 0.9444 | 0.9997 | 0.8888 | 0.8230 |
| 0.0008 | 48.5714 | 680 | 0.0007 | 0.9138 | 0.9546 | 0.9997 | 0.9093 | 0.8279 |
| 0.001 | 49.2857 | 690 | 0.0007 | 0.9136 | 0.9512 | 0.9997 | 0.9025 | 0.8275 |
| 0.0009 | 50.0 | 700 | 0.0007 | 0.9127 | 0.9490 | 0.9997 | 0.8982 | 0.8258 |
| 0.0006 | 50.7143 | 710 | 0.0007 | 0.9143 | 0.9527 | 0.9997 | 0.9055 | 0.8289 |
| 0.0011 | 51.4286 | 720 | 0.0007 | 0.9127 | 0.9475 | 0.9997 | 0.8951 | 0.8257 |
| 0.0003 | 52.1429 | 730 | 0.0007 | 0.9138 | 0.9500 | 0.9997 | 0.9002 | 0.8280 |
| 0.0005 | 52.8571 | 740 | 0.0007 | 0.9141 | 0.9541 | 0.9997 | 0.9083 | 0.8285 |
| 0.0011 | 53.5714 | 750 | 0.0007 | 0.9146 | 0.9526 | 0.9997 | 0.9052 | 0.8295 |
| 0.0005 | 54.2857 | 760 | 0.0007 | 0.9139 | 0.9509 | 0.9997 | 0.9019 | 0.8281 |
| 0.0005 | 55.0 | 770 | 0.0007 | 0.9134 | 0.9468 | 0.9997 | 0.8937 | 0.8270 |
| 0.0009 | 55.7143 | 780 | 0.0007 | 0.9150 | 0.9528 | 0.9997 | 0.9058 | 0.8302 |
| 0.0011 | 56.4286 | 790 | 0.0007 | 0.9133 | 0.9461 | 0.9997 | 0.8924 | 0.8268 |
| 0.0015 | 57.1429 | 800 | 0.0007 | 0.9143 | 0.9507 | 0.9997 | 0.9016 | 0.8289 |
| 0.0009 | 57.8571 | 810 | 0.0007 | 0.9148 | 0.9509 | 0.9997 | 0.9019 | 0.8299 |
| 0.0006 | 58.5714 | 820 | 0.0007 | 0.9146 | 0.9507 | 0.9997 | 0.9015 | 0.8294 |
| 0.0003 | 59.2857 | 830 | 0.0007 | 0.9152 | 0.9530 | 0.9997 | 0.9062 | 0.8307 |
| 0.0006 | 60.0 | 840 | 0.0007 | 0.9144 | 0.9487 | 0.9997 | 0.8974 | 0.8292 |
| 0.0006 | 60.7143 | 850 | 0.0007 | 0.9149 | 0.9529 | 0.9997 | 0.9060 | 0.8300 |
| 0.0006 | 61.4286 | 860 | 0.0007 | 0.9159 | 0.9556 | 0.9997 | 0.9115 | 0.8320 |
| 0.0004 | 62.1429 | 870 | 0.0007 | 0.9143 | 0.9499 | 0.9997 | 0.8999 | 0.8288 |
| 0.0008 | 62.8571 | 880 | 0.0007 | 0.9150 | 0.9537 | 0.9997 | 0.9076 | 0.8303 |
| 0.0008 | 63.5714 | 890 | 0.0007 | 0.9154 | 0.9493 | 0.9997 | 0.8987 | 0.8311 |
| 0.0006 | 64.2857 | 900 | 0.0007 | 0.9158 | 0.9572 | 0.9997 | 0.9146 | 0.8319 |
| 0.0013 | 65.0 | 910 | 0.0007 | 0.9150 | 0.9509 | 0.9997 | 0.9020 | 0.8304 |
| 0.0008 | 65.7143 | 920 | 0.0007 | 0.9148 | 0.9487 | 0.9997 | 0.8974 | 0.8300 |
| 0.0009 | 66.4286 | 930 | 0.0007 | 0.9164 | 0.9555 | 0.9997 | 0.9111 | 0.8332 |
| 0.0007 | 67.1429 | 940 | 0.0007 | 0.9167 | 0.9521 | 0.9997 | 0.9043 | 0.8337 |
| 0.0005 | 67.8571 | 950 | 0.0007 | 0.9163 | 0.9540 | 0.9997 | 0.9082 | 0.8328 |
| 0.0009 | 68.5714 | 960 | 0.0007 | 0.9157 | 0.9489 | 0.9997 | 0.8979 | 0.8316 |
| 0.001 | 69.2857 | 970 | 0.0007 | 0.9160 | 0.9548 | 0.9997 | 0.9098 | 0.8322 |
| 0.0006 | 70.0 | 980 | 0.0007 | 0.9156 | 0.9492 | 0.9997 | 0.8985 | 0.8315 |
| 0.001 | 70.7143 | 990 | 0.0007 | 0.9160 | 0.9507 | 0.9997 | 0.9015 | 0.8323 |
| 0.0006 | 71.4286 | 1000 | 0.0007 | 0.9154 | 0.9484 | 0.9997 | 0.8970 | 0.8310 |
| 0.0014 | 72.1429 | 1010 | 0.0007 | 0.9165 | 0.9534 | 0.9997 | 0.9068 | 0.8332 |
| 0.0008 | 72.8571 | 1020 | 0.0007 | 0.9165 | 0.9513 | 0.9997 | 0.9028 | 0.8333 |
| 0.0007 | 73.5714 | 1030 | 0.0007 | 0.9167 | 0.9530 | 0.9997 | 0.9061 | 0.8338 |
| 0.0008 | 74.2857 | 1040 | 0.0007 | 0.9159 | 0.9526 | 0.9997 | 0.9052 | 0.8321 |
| 0.0006 | 75.0 | 1050 | 0.0007 | 0.9154 | 0.9503 | 0.9997 | 0.9007 | 0.8312 |
| 0.0007 | 75.7143 | 1060 | 0.0007 | 0.9165 | 0.9545 | 0.9997 | 0.9091 | 0.8332 |
| 0.0011 | 76.4286 | 1070 | 0.0007 | 0.9168 | 0.9543 | 0.9997 | 0.9087 | 0.8338 |
| 0.0009 | 77.1429 | 1080 | 0.0007 | 0.9158 | 0.9527 | 0.9997 | 0.9055 | 0.8320 |
| 0.0005 | 77.8571 | 1090 | 0.0007 | 0.9168 | 0.9511 | 0.9997 | 0.9023 | 0.8338 |
| 0.0005 | 78.5714 | 1100 | 0.0007 | 0.9162 | 0.9502 | 0.9997 | 0.9005 | 0.8328 |
| 0.0009 | 79.2857 | 1110 | 0.0007 | 0.9174 | 0.9533 | 0.9997 | 0.9068 | 0.8350 |
| 0.0004 | 80.0 | 1120 | 0.0007 | 0.9162 | 0.9495 | 0.9997 | 0.8990 | 0.8327 |
| 0.0002 | 80.7143 | 1130 | 0.0007 | 0.9165 | 0.9507 | 0.9997 | 0.9014 | 0.8332 |
| 0.0005 | 81.4286 | 1140 | 0.0007 | 0.9164 | 0.9499 | 0.9997 | 0.8999 | 0.8332 |
| 0.0009 | 82.1429 | 1150 | 0.0007 | 0.9170 | 0.9543 | 0.9997 | 0.9087 | 0.8342 |
| 0.0009 | 82.8571 | 1160 | 0.0007 | 0.9165 | 0.9523 | 0.9997 | 0.9048 | 0.8334 |
| 0.0006 | 83.5714 | 1170 | 0.0007 | 0.9165 | 0.9519 | 0.9997 | 0.9039 | 0.8332 |
| 0.0008 | 84.2857 | 1180 | 0.0007 | 0.9161 | 0.9515 | 0.9997 | 0.9032 | 0.8325 |
| 0.0006 | 85.0 | 1190 | 0.0007 | 0.9169 | 0.9525 | 0.9997 | 0.9051 | 0.8340 |
| 0.0005 | 85.7143 | 1200 | 0.0007 | 0.9167 | 0.9518 | 0.9997 | 0.9037 | 0.8337 |
| 0.0002 | 86.4286 | 1210 | 0.0007 | 0.9167 | 0.9519 | 0.9997 | 0.9040 | 0.8337 |
| 0.0004 | 87.1429 | 1220 | 0.0007 | 0.9167 | 0.9518 | 0.9997 | 0.9037 | 0.8337 |
| 0.0009 | 87.8571 | 1230 | 0.0007 | 0.9169 | 0.9520 | 0.9997 | 0.9042 | 0.8340 |
| 0.0011 | 88.5714 | 1240 | 0.0007 | 0.9171 | 0.9526 | 0.9997 | 0.9053 | 0.8345 |
| 0.0006 | 89.2857 | 1250 | 0.0007 | 0.9171 | 0.9518 | 0.9997 | 0.9037 | 0.8346 |
| 0.0007 | 90.0 | 1260 | 0.0007 | 0.9174 | 0.9551 | 0.9997 | 0.9104 | 0.8351 |
| 0.0005 | 90.7143 | 1270 | 0.0007 | 0.9168 | 0.9534 | 0.9997 | 0.9069 | 0.8340 |
| 0.0007 | 91.4286 | 1280 | 0.0007 | 0.9169 | 0.9519 | 0.9997 | 0.9040 | 0.8341 |
| 0.0009 | 92.1429 | 1290 | 0.0007 | 0.9175 | 0.9526 | 0.9997 | 0.9052 | 0.8352 |
| 0.0009 | 92.8571 | 1300 | 0.0007 | 0.9177 | 0.9532 | 0.9997 | 0.9066 | 0.8356 |
| 0.0007 | 93.5714 | 1310 | 0.0007 | 0.9174 | 0.9525 | 0.9997 | 0.9051 | 0.8351 |
| 0.0007 | 94.2857 | 1320 | 0.0007 | 0.9170 | 0.9518 | 0.9997 | 0.9037 | 0.8343 |
| 0.0015 | 95.0 | 1330 | 0.0007 | 0.9173 | 0.9535 | 0.9997 | 0.9071 | 0.8349 |
| 0.0005 | 95.7143 | 1340 | 0.0007 | 0.9176 | 0.9534 | 0.9997 | 0.9069 | 0.8355 |
| 0.0007 | 96.4286 | 1350 | 0.0007 | 0.9174 | 0.9525 | 0.9997 | 0.9051 | 0.8351 |
| 0.001 | 97.1429 | 1360 | 0.0007 | 0.9175 | 0.9527 | 0.9997 | 0.9056 | 0.8353 |
| 0.001 | 97.8571 | 1370 | 0.0007 | 0.9175 | 0.9526 | 0.9997 | 0.9052 | 0.8354 |
| 0.0007 | 98.5714 | 1380 | 0.0007 | 0.9173 | 0.9518 | 0.9997 | 0.9037 | 0.8349 |
| 0.0006 | 99.2857 | 1390 | 0.0007 | 0.9175 | 0.9514 | 0.9997 | 0.9029 | 0.8352 |
| 0.0011 | 100.0 | 1400 | 0.0007 | 0.9173 | 0.9515 | 0.9997 | 0.9030 | 0.8348 |
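
The metric names in this table match the output schema of the `evaluate` library's `mean_iou` metric. The sketch below shows how such values can be computed; the label ids (0 = background, 1 = abnormality) are assumptions inferred from the metric names, and the masks are synthetic stand-ins.

```python
# Hedged sketch: computing Mean IoU and per-class metrics with `evaluate`.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=(512, 512))  # synthetic ground-truth mask
preds = labels.copy()                         # synthetic predicted mask

results = metric.compute(
    predictions=[preds],
    references=[labels],
    num_labels=2,        # assumed: background + abnormality
    ignore_index=255,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
# The per-class columns ("Accuracy Abnormality", "Iou Abnormality") correspond
# to results["per_category_accuracy"][1] and results["per_category_iou"][1].
```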

Framework versions

  • Transformers 4.52.3
  • PyTorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1