model_id | model_card | model_labels
---|---|---
mujerry/deeplabv3-mobilevit-small_corm |
# deeplabv3-mobilevit-small_corm
This model is a fine-tuned version of [apple/deeplabv3-mobilevit-small](https://huggingface.co/apple/deeplabv3-mobilevit-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7777
- Mean Iou: 0.4137
- Mean Accuracy: 0.5038
- Overall Accuracy: 0.7714
- Accuracy Background: 0.9998
- Accuracy Corm: 0.0748
- Accuracy Damage: 0.4368
- Iou Background: 0.7626
- Iou Corm: 0.0734
- Iou Damage: 0.4050
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
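For reproduction, these hyperparameters map onto `transformers` `TrainingArguments` roughly as follows (a hedged sketch; `output_dir` and the dataset wiring are placeholders, not taken from this card):

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="deeplabv3-mobilevit-small_corm",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```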
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | Iou Background | Iou Corm | Iou Damage |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:---------------:|:--------------:|:--------:|:----------:|
| 1.0611 | 0.3077 | 20 | 1.0627 | 0.4251 | 0.6398 | 0.6594 | 0.6709 | 0.5632 | 0.6853 | 0.6508 | 0.2723 | 0.3522 |
| 1.0077 | 0.6154 | 40 | 1.0227 | 0.5359 | 0.6715 | 0.8018 | 0.9058 | 0.4028 | 0.7060 | 0.8285 | 0.3049 | 0.4743 |
| 1.0089 | 0.9231 | 60 | 0.9859 | 0.5131 | 0.6138 | 0.8167 | 0.9804 | 0.2104 | 0.6506 | 0.8345 | 0.1883 | 0.5167 |
| 0.9397 | 1.2308 | 80 | 0.9637 | 0.4767 | 0.5831 | 0.8103 | 0.9867 | 0.0722 | 0.6904 | 0.8293 | 0.0701 | 0.5307 |
| 0.9347 | 1.5385 | 100 | 0.9257 | 0.4544 | 0.5545 | 0.7991 | 0.9946 | 0.0517 | 0.6172 | 0.8103 | 0.0504 | 0.5024 |
| 0.9007 | 1.8462 | 120 | 0.9054 | 0.4458 | 0.5428 | 0.7926 | 0.9968 | 0.0678 | 0.5637 | 0.8012 | 0.0658 | 0.4705 |
| 0.8787 | 2.1538 | 140 | 0.8756 | 0.4195 | 0.5140 | 0.7780 | 0.9986 | 0.0506 | 0.4927 | 0.7790 | 0.0495 | 0.4301 |
| 0.8757 | 2.4615 | 160 | 0.8501 | 0.4035 | 0.4967 | 0.7677 | 0.9994 | 0.0659 | 0.4247 | 0.7656 | 0.0645 | 0.3804 |
| 0.841 | 2.7692 | 180 | 0.8339 | 0.4199 | 0.5148 | 0.7799 | 0.9987 | 0.0283 | 0.5174 | 0.7791 | 0.0279 | 0.4528 |
| 0.8268 | 3.0769 | 200 | 0.8246 | 0.4358 | 0.5279 | 0.7844 | 0.9989 | 0.0809 | 0.5040 | 0.7826 | 0.0789 | 0.4460 |
| 0.8306 | 3.3846 | 220 | 0.8095 | 0.4034 | 0.4968 | 0.7690 | 0.9995 | 0.0461 | 0.4448 | 0.7653 | 0.0455 | 0.3995 |
| 0.826 | 3.6923 | 240 | 0.7928 | 0.4174 | 0.5078 | 0.7731 | 0.9997 | 0.0846 | 0.4391 | 0.7663 | 0.0826 | 0.4034 |
| 0.7873 | 4.0 | 260 | 0.7915 | 0.4150 | 0.5046 | 0.7713 | 0.9996 | 0.0842 | 0.4299 | 0.7616 | 0.0824 | 0.4009 |
| 0.8031 | 4.3077 | 280 | 0.7805 | 0.4022 | 0.4928 | 0.7648 | 0.9998 | 0.0826 | 0.3960 | 0.7557 | 0.0811 | 0.3699 |
| 0.7881 | 4.6154 | 300 | 0.7791 | 0.4352 | 0.5265 | 0.7848 | 0.9995 | 0.0659 | 0.5142 | 0.7808 | 0.0645 | 0.4604 |
| 0.7883 | 4.9231 | 320 | 0.7777 | 0.4137 | 0.5038 | 0.7714 | 0.9998 | 0.0748 | 0.4368 | 0.7626 | 0.0734 | 0.4050 |
### Framework versions
- Transformers 4.44.1
- Pytorch 2.6.0+cpu
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"background",
"corm",
"damage"
] |
mujerry/segformer-b0-finetuned-cityscapes-1024-1024_corm |
# segformer-b0-finetuned-cityscapes-1024-1024_corm
This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-cityscapes-1024-1024](https://huggingface.co/nvidia/segformer-b0-finetuned-cityscapes-1024-1024) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0506
- Mean Iou: 0.9091
- Mean Accuracy: 0.9514
- Overall Accuracy: 0.9835
- Accuracy Background: 0.9977
- Accuracy Corm: 0.9256
- Accuracy Damage: 0.9308
- Iou Background: 0.9938
- Iou Corm: 0.8393
- Iou Damage: 0.8942
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
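With `lr_scheduler_warmup_ratio: 0.05`, the learning rate ramps up linearly over the first 5% of optimizer steps before decaying linearly. A minimal sketch of the equivalent schedule, assuming the 1320 total steps shown in the table below:

```python
import torch
from transformers import get_linear_schedule_with_warmup

# A dummy parameter stands in for the real model's parameters.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.Adam(params, lr=6e-5, betas=(0.9, 0.999), eps=1e-8)

total_steps = 1320                      # 40 epochs x 33 steps/epoch (see table below)
warmup_steps = int(0.05 * total_steps)  # warmup ratio 0.05 -> 66 warmup steps
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)
```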
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | Iou Background | Iou Corm | Iou Damage |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:---------------:|:--------------:|:--------:|:----------:|
| 0.8589 | 0.6061 | 20 | 0.8411 | 0.4729 | 0.5967 | 0.8533 | 0.9785 | 0.5630 | 0.2486 | 0.9291 | 0.2623 | 0.2273 |
| 0.7143 | 1.2121 | 40 | 0.6737 | 0.6855 | 0.8154 | 0.9261 | 0.9743 | 0.7133 | 0.7586 | 0.9723 | 0.4438 | 0.6403 |
| 0.5785 | 1.8182 | 60 | 0.5154 | 0.7300 | 0.8494 | 0.9393 | 0.9707 | 0.6523 | 0.9253 | 0.9704 | 0.4854 | 0.7342 |
| 0.4449 | 2.4242 | 80 | 0.3898 | 0.8138 | 0.9106 | 0.9596 | 0.9778 | 0.8207 | 0.9333 | 0.9771 | 0.6594 | 0.8048 |
| 0.3313 | 3.0303 | 100 | 0.3061 | 0.8287 | 0.9246 | 0.9633 | 0.9820 | 0.9175 | 0.8743 | 0.9809 | 0.7007 | 0.8046 |
| 0.2799 | 3.6364 | 120 | 0.2367 | 0.8545 | 0.9230 | 0.9717 | 0.9907 | 0.8453 | 0.9331 | 0.9882 | 0.7423 | 0.8330 |
| 0.2482 | 4.2424 | 140 | 0.2088 | 0.8669 | 0.9301 | 0.9746 | 0.9927 | 0.8707 | 0.9269 | 0.9899 | 0.7653 | 0.8455 |
| 0.2134 | 4.8485 | 160 | 0.1841 | 0.8637 | 0.9343 | 0.9739 | 0.9939 | 0.9403 | 0.8687 | 0.9908 | 0.7666 | 0.8337 |
| 0.1695 | 5.4545 | 180 | 0.1585 | 0.8832 | 0.9405 | 0.9780 | 0.9943 | 0.9053 | 0.9221 | 0.9916 | 0.7954 | 0.8625 |
| 0.1581 | 6.0606 | 200 | 0.1414 | 0.8895 | 0.9410 | 0.9795 | 0.9959 | 0.8998 | 0.9273 | 0.9922 | 0.8051 | 0.8711 |
| 0.1413 | 6.6667 | 220 | 0.1253 | 0.8943 | 0.9430 | 0.9805 | 0.9966 | 0.9060 | 0.9263 | 0.9926 | 0.8141 | 0.8763 |
| 0.1125 | 7.2727 | 240 | 0.1138 | 0.8955 | 0.9453 | 0.9807 | 0.9965 | 0.9193 | 0.9203 | 0.9927 | 0.8169 | 0.8770 |
| 0.1195 | 7.8788 | 260 | 0.1124 | 0.8811 | 0.9411 | 0.9779 | 0.9967 | 0.9503 | 0.8763 | 0.9926 | 0.7970 | 0.8537 |
| 0.1032 | 8.4848 | 280 | 0.1049 | 0.8912 | 0.9457 | 0.9798 | 0.9964 | 0.9413 | 0.8994 | 0.9927 | 0.8117 | 0.8692 |
| 0.104 | 9.0909 | 300 | 0.0912 | 0.9013 | 0.9459 | 0.9819 | 0.9971 | 0.9070 | 0.9337 | 0.9929 | 0.8255 | 0.8856 |
| 0.0994 | 9.6970 | 320 | 0.0887 | 0.9022 | 0.9473 | 0.9820 | 0.9972 | 0.9172 | 0.9275 | 0.9930 | 0.8275 | 0.8860 |
| 0.0914 | 10.3030 | 340 | 0.0866 | 0.8999 | 0.9471 | 0.9815 | 0.9974 | 0.9278 | 0.9160 | 0.9930 | 0.8249 | 0.8819 |
| 0.0917 | 10.9091 | 360 | 0.0813 | 0.9032 | 0.9473 | 0.9822 | 0.9975 | 0.9166 | 0.9279 | 0.9930 | 0.8293 | 0.8873 |
| 0.0822 | 11.5152 | 380 | 0.0774 | 0.9038 | 0.9454 | 0.9825 | 0.9972 | 0.8900 | 0.9490 | 0.9932 | 0.8280 | 0.8903 |
| 0.078 | 12.1212 | 400 | 0.0766 | 0.9035 | 0.9488 | 0.9823 | 0.9973 | 0.9244 | 0.9247 | 0.9932 | 0.8301 | 0.8871 |
| 0.0782 | 12.7273 | 420 | 0.0739 | 0.9027 | 0.9490 | 0.9822 | 0.9974 | 0.9302 | 0.9195 | 0.9933 | 0.8293 | 0.8856 |
| 0.0759 | 13.3333 | 440 | 0.0715 | 0.9025 | 0.9487 | 0.9821 | 0.9975 | 0.9316 | 0.9170 | 0.9933 | 0.8292 | 0.8851 |
| 0.066 | 13.9394 | 460 | 0.0691 | 0.9059 | 0.9480 | 0.9828 | 0.9975 | 0.9089 | 0.9377 | 0.9934 | 0.8328 | 0.8915 |
| 0.0774 | 14.5455 | 480 | 0.0674 | 0.9059 | 0.9493 | 0.9827 | 0.9976 | 0.9237 | 0.9267 | 0.9933 | 0.8339 | 0.8904 |
| 0.0719 | 15.1515 | 500 | 0.0690 | 0.9034 | 0.9488 | 0.9823 | 0.9977 | 0.9305 | 0.9183 | 0.9933 | 0.8307 | 0.8862 |
| 0.0713 | 15.7576 | 520 | 0.0666 | 0.9003 | 0.9486 | 0.9817 | 0.9974 | 0.9368 | 0.9117 | 0.9935 | 0.8262 | 0.8814 |
| 0.0647 | 16.3636 | 540 | 0.0645 | 0.9033 | 0.9498 | 0.9823 | 0.9974 | 0.9346 | 0.9173 | 0.9934 | 0.8307 | 0.8858 |
| 0.0576 | 16.9697 | 560 | 0.0637 | 0.9046 | 0.9499 | 0.9826 | 0.9975 | 0.9301 | 0.9221 | 0.9936 | 0.8323 | 0.8879 |
| 0.0598 | 17.5758 | 580 | 0.0625 | 0.9044 | 0.9501 | 0.9825 | 0.9974 | 0.9333 | 0.9197 | 0.9935 | 0.8321 | 0.8875 |
| 0.0676 | 18.1818 | 600 | 0.0644 | 0.8991 | 0.9489 | 0.9815 | 0.9974 | 0.9447 | 0.9047 | 0.9936 | 0.8243 | 0.8794 |
| 0.0474 | 18.7879 | 620 | 0.0624 | 0.9018 | 0.9503 | 0.9820 | 0.9973 | 0.9427 | 0.9108 | 0.9936 | 0.8285 | 0.8832 |
| 0.0611 | 19.3939 | 640 | 0.0606 | 0.9064 | 0.9504 | 0.9829 | 0.9976 | 0.9282 | 0.9254 | 0.9936 | 0.8350 | 0.8905 |
| 0.058 | 20.0 | 660 | 0.0596 | 0.9048 | 0.9508 | 0.9826 | 0.9973 | 0.9355 | 0.9197 | 0.9936 | 0.8330 | 0.8877 |
| 0.0574 | 20.6061 | 680 | 0.0575 | 0.9082 | 0.9484 | 0.9834 | 0.9973 | 0.8972 | 0.9507 | 0.9938 | 0.8356 | 0.8951 |
| 0.0562 | 21.2121 | 700 | 0.0576 | 0.9065 | 0.9465 | 0.9831 | 0.9973 | 0.8870 | 0.9553 | 0.9937 | 0.8319 | 0.8939 |
| 0.0551 | 21.8182 | 720 | 0.0571 | 0.9067 | 0.9515 | 0.9830 | 0.9972 | 0.9299 | 0.9274 | 0.9938 | 0.8361 | 0.8903 |
| 0.0498 | 22.4242 | 740 | 0.0564 | 0.9090 | 0.9509 | 0.9834 | 0.9976 | 0.9217 | 0.9336 | 0.9936 | 0.8388 | 0.8945 |
| 0.0566 | 23.0303 | 760 | 0.0554 | 0.9067 | 0.9511 | 0.9830 | 0.9975 | 0.9307 | 0.9251 | 0.9938 | 0.8359 | 0.8904 |
| 0.0436 | 23.6364 | 780 | 0.0567 | 0.9056 | 0.9509 | 0.9827 | 0.9976 | 0.9362 | 0.9191 | 0.9936 | 0.8344 | 0.8889 |
| 0.0586 | 24.2424 | 800 | 0.0548 | 0.9081 | 0.9515 | 0.9832 | 0.9974 | 0.9273 | 0.9296 | 0.9938 | 0.8380 | 0.8924 |
| 0.0497 | 24.8485 | 820 | 0.0549 | 0.9091 | 0.9511 | 0.9834 | 0.9976 | 0.9227 | 0.9331 | 0.9937 | 0.8390 | 0.8946 |
| 0.0535 | 25.4545 | 840 | 0.0544 | 0.9073 | 0.9510 | 0.9831 | 0.9976 | 0.9296 | 0.9256 | 0.9937 | 0.8368 | 0.8913 |
| 0.0514 | 26.0606 | 860 | 0.0539 | 0.9096 | 0.9514 | 0.9836 | 0.9975 | 0.9205 | 0.9362 | 0.9938 | 0.8399 | 0.8953 |
| 0.0684 | 26.6667 | 880 | 0.0550 | 0.9055 | 0.9511 | 0.9827 | 0.9976 | 0.9383 | 0.9174 | 0.9937 | 0.8344 | 0.8884 |
| 0.0542 | 27.2727 | 900 | 0.0524 | 0.9100 | 0.9512 | 0.9836 | 0.9976 | 0.9196 | 0.9364 | 0.9938 | 0.8403 | 0.8959 |
| 0.0455 | 27.8788 | 920 | 0.0534 | 0.9083 | 0.9518 | 0.9833 | 0.9975 | 0.9296 | 0.9282 | 0.9938 | 0.8384 | 0.8929 |
| 0.0512 | 28.4848 | 940 | 0.0525 | 0.9095 | 0.9504 | 0.9836 | 0.9977 | 0.9149 | 0.9386 | 0.9938 | 0.8393 | 0.8954 |
| 0.0486 | 29.0909 | 960 | 0.0524 | 0.9083 | 0.9516 | 0.9833 | 0.9976 | 0.9292 | 0.9280 | 0.9938 | 0.8383 | 0.8927 |
| 0.0486 | 29.6970 | 980 | 0.0517 | 0.9099 | 0.9509 | 0.9836 | 0.9976 | 0.9172 | 0.9380 | 0.9938 | 0.8401 | 0.8958 |
| 0.0388 | 30.3030 | 1000 | 0.0514 | 0.9101 | 0.9510 | 0.9837 | 0.9975 | 0.9147 | 0.9408 | 0.9938 | 0.8402 | 0.8962 |
| 0.0571 | 30.9091 | 1020 | 0.0518 | 0.9091 | 0.9513 | 0.9834 | 0.9977 | 0.9247 | 0.9314 | 0.9938 | 0.8392 | 0.8942 |
| 0.0515 | 31.5152 | 1040 | 0.0511 | 0.9103 | 0.9510 | 0.9837 | 0.9975 | 0.9135 | 0.9418 | 0.9939 | 0.8404 | 0.8966 |
| 0.049 | 32.1212 | 1060 | 0.0517 | 0.9100 | 0.9510 | 0.9836 | 0.9976 | 0.9171 | 0.9385 | 0.9938 | 0.8402 | 0.8958 |
| 0.0533 | 32.7273 | 1080 | 0.0513 | 0.9095 | 0.9514 | 0.9835 | 0.9975 | 0.9221 | 0.9346 | 0.9938 | 0.8398 | 0.8949 |
| 0.0443 | 33.3333 | 1100 | 0.0513 | 0.9092 | 0.9513 | 0.9835 | 0.9977 | 0.9245 | 0.9317 | 0.9938 | 0.8395 | 0.8944 |
| 0.0573 | 33.9394 | 1120 | 0.0516 | 0.9089 | 0.9515 | 0.9834 | 0.9976 | 0.9270 | 0.9297 | 0.9938 | 0.8390 | 0.8938 |
| 0.0421 | 34.5455 | 1140 | 0.0516 | 0.9082 | 0.9512 | 0.9833 | 0.9977 | 0.9294 | 0.9264 | 0.9938 | 0.8382 | 0.8927 |
| 0.0509 | 35.1515 | 1160 | 0.0503 | 0.9102 | 0.9508 | 0.9837 | 0.9976 | 0.9145 | 0.9403 | 0.9938 | 0.8403 | 0.8966 |
| 0.0854 | 35.7576 | 1180 | 0.0511 | 0.9087 | 0.9518 | 0.9834 | 0.9975 | 0.9285 | 0.9293 | 0.9938 | 0.8388 | 0.8934 |
| 0.0522 | 36.3636 | 1200 | 0.0508 | 0.9089 | 0.9516 | 0.9834 | 0.9976 | 0.9269 | 0.9302 | 0.9938 | 0.8392 | 0.8938 |
| 0.0648 | 36.9697 | 1220 | 0.0503 | 0.9103 | 0.9514 | 0.9837 | 0.9975 | 0.9175 | 0.9391 | 0.9939 | 0.8408 | 0.8964 |
| 0.0513 | 37.5758 | 1240 | 0.0502 | 0.9099 | 0.9511 | 0.9836 | 0.9977 | 0.9203 | 0.9353 | 0.9938 | 0.8402 | 0.8957 |
| 0.0494 | 38.1818 | 1260 | 0.0512 | 0.9093 | 0.9516 | 0.9835 | 0.9976 | 0.9257 | 0.9316 | 0.9938 | 0.8396 | 0.8944 |
| 0.0513 | 38.7879 | 1280 | 0.0510 | 0.9096 | 0.9517 | 0.9836 | 0.9975 | 0.9232 | 0.9343 | 0.9939 | 0.8400 | 0.8949 |
| 0.0573 | 39.3939 | 1300 | 0.0508 | 0.9092 | 0.9514 | 0.9835 | 0.9976 | 0.9249 | 0.9318 | 0.9938 | 0.8395 | 0.8943 |
| 0.0627 | 40.0 | 1320 | 0.0506 | 0.9091 | 0.9514 | 0.9835 | 0.9977 | 0.9256 | 0.9308 | 0.9938 | 0.8393 | 0.8942 |
### Framework versions
- Transformers 4.44.1
- Pytorch 2.6.0+cpu
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"background",
"corm",
"damage"
] |
mujerry/segformer-b2-finetuned-ade-512-512_corm |
# segformer-b2-finetuned-ade-512-512_corm
This model is a fine-tuned version of [nvidia/segformer-b2-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b2-finetuned-ade-512-512) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0415
- Mean Iou: 0.9264
- Mean Accuracy: 0.9599
- Overall Accuracy: 0.9860
- Accuracy Background: 0.9978
- Accuracy Corm: 0.9362
- Accuracy Damage: 0.9456
- Iou Background: 0.9942
- Iou Corm: 0.8799
- Iou Damage: 0.9052
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
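Like any SegFormer checkpoint, the fine-tuned model can be loaded through the standard `transformers` API; a minimal inference sketch (the image path is a placeholder, not part of this card):

```python
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "mujerry/segformer-b2-finetuned-ade-512-512_corm"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("corm_sample.png").convert("RGB")  # placeholder path
inputs = processor(image, return_tensors="pt")
outputs = model(**inputs)

# Upsample the logits back to the input size and take the per-pixel argmax.
seg = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[(image.size[1], image.size[0])]
)[0]
# Class indices follow the card's label list: 0=background, 1=corm, 2=damage.
```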
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | Iou Background | Iou Corm | Iou Damage |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:---------------:|:--------------:|:--------:|:----------:|
| 0.8746 | 0.9524 | 20 | 0.8170 | 0.4637 | 0.6489 | 0.8404 | 0.9133 | 0.0390 | 0.9944 | 0.9132 | 0.0327 | 0.4451 |
| 0.61 | 1.9048 | 40 | 0.4500 | 0.7608 | 0.8748 | 0.9451 | 0.9731 | 0.6946 | 0.9566 | 0.9730 | 0.6023 | 0.7071 |
| 0.3681 | 2.8571 | 60 | 0.2802 | 0.8597 | 0.9314 | 0.9711 | 0.9879 | 0.8621 | 0.9443 | 0.9873 | 0.7716 | 0.8201 |
| 0.2433 | 3.8095 | 80 | 0.2201 | 0.8866 | 0.9456 | 0.9774 | 0.9916 | 0.9159 | 0.9293 | 0.9904 | 0.8181 | 0.8513 |
| 0.1669 | 4.7619 | 100 | 0.1607 | 0.8891 | 0.9431 | 0.9783 | 0.9930 | 0.8723 | 0.9640 | 0.9915 | 0.8178 | 0.8580 |
| 0.1484 | 5.7143 | 120 | 0.1250 | 0.9017 | 0.9489 | 0.9812 | 0.9964 | 0.9423 | 0.9080 | 0.9932 | 0.8432 | 0.8688 |
| 0.1126 | 6.6667 | 140 | 0.1024 | 0.9092 | 0.9524 | 0.9827 | 0.9962 | 0.9247 | 0.9363 | 0.9934 | 0.8539 | 0.8804 |
| 0.0909 | 7.6190 | 160 | 0.0932 | 0.9017 | 0.9492 | 0.9813 | 0.9968 | 0.9563 | 0.8944 | 0.9935 | 0.8445 | 0.8670 |
| 0.0994 | 8.5714 | 180 | 0.0803 | 0.9122 | 0.9527 | 0.9833 | 0.9967 | 0.9118 | 0.9495 | 0.9936 | 0.8568 | 0.8861 |
| 0.0768 | 9.5238 | 200 | 0.0716 | 0.9147 | 0.9533 | 0.9838 | 0.9975 | 0.9247 | 0.9376 | 0.9937 | 0.8615 | 0.8889 |
| 0.0749 | 10.4762 | 220 | 0.0671 | 0.9177 | 0.9550 | 0.9844 | 0.9973 | 0.9191 | 0.9487 | 0.9939 | 0.8661 | 0.8932 |
| 0.0663 | 11.4286 | 240 | 0.0668 | 0.9097 | 0.9528 | 0.9829 | 0.9973 | 0.9528 | 0.9083 | 0.9939 | 0.8558 | 0.8795 |
| 0.0725 | 12.3810 | 260 | 0.0608 | 0.9189 | 0.9554 | 0.9847 | 0.9974 | 0.9123 | 0.9564 | 0.9940 | 0.8677 | 0.8951 |
| 0.0594 | 13.3333 | 280 | 0.0588 | 0.9167 | 0.9533 | 0.9843 | 0.9975 | 0.9000 | 0.9625 | 0.9940 | 0.8622 | 0.8940 |
| 0.062 | 14.2857 | 300 | 0.0552 | 0.9201 | 0.9565 | 0.9849 | 0.9972 | 0.9170 | 0.9553 | 0.9941 | 0.8691 | 0.8970 |
| 0.0535 | 15.2381 | 320 | 0.0543 | 0.9195 | 0.9559 | 0.9848 | 0.9972 | 0.9078 | 0.9626 | 0.9942 | 0.8683 | 0.8962 |
| 0.0555 | 16.1905 | 340 | 0.0517 | 0.9212 | 0.9566 | 0.9851 | 0.9973 | 0.9113 | 0.9612 | 0.9942 | 0.8704 | 0.8990 |
| 0.0553 | 17.1429 | 360 | 0.0513 | 0.9198 | 0.9553 | 0.9849 | 0.9975 | 0.9047 | 0.9638 | 0.9942 | 0.8679 | 0.8974 |
| 0.0572 | 18.0952 | 380 | 0.0501 | 0.9219 | 0.9563 | 0.9853 | 0.9977 | 0.9108 | 0.9603 | 0.9942 | 0.8713 | 0.9002 |
| 0.0503 | 19.0476 | 400 | 0.0483 | 0.9245 | 0.9573 | 0.9856 | 0.9981 | 0.9212 | 0.9525 | 0.9940 | 0.8757 | 0.9037 |
| 0.0539 | 20.0 | 420 | 0.0474 | 0.9245 | 0.9593 | 0.9857 | 0.9974 | 0.9309 | 0.9497 | 0.9942 | 0.8769 | 0.9024 |
| 0.0542 | 20.9524 | 440 | 0.0484 | 0.9202 | 0.9575 | 0.9849 | 0.9978 | 0.9511 | 0.9235 | 0.9941 | 0.8718 | 0.8949 |
| 0.033 | 21.9048 | 460 | 0.0478 | 0.9209 | 0.9576 | 0.9850 | 0.9977 | 0.9464 | 0.9287 | 0.9941 | 0.8726 | 0.8961 |
| 0.0421 | 22.8571 | 480 | 0.0452 | 0.9247 | 0.9591 | 0.9857 | 0.9974 | 0.9244 | 0.9555 | 0.9942 | 0.8766 | 0.9033 |
| 0.0472 | 23.8095 | 500 | 0.0455 | 0.9243 | 0.9583 | 0.9857 | 0.9976 | 0.9231 | 0.9543 | 0.9942 | 0.8759 | 0.9028 |
| 0.0381 | 24.7619 | 520 | 0.0456 | 0.9233 | 0.9570 | 0.9855 | 0.9977 | 0.9109 | 0.9625 | 0.9942 | 0.8732 | 0.9026 |
| 0.0486 | 25.7143 | 540 | 0.0444 | 0.9249 | 0.9593 | 0.9857 | 0.9978 | 0.9408 | 0.9394 | 0.9941 | 0.8780 | 0.9026 |
| 0.0501 | 26.6667 | 560 | 0.0458 | 0.9208 | 0.9579 | 0.9850 | 0.9977 | 0.9508 | 0.9252 | 0.9942 | 0.8725 | 0.8957 |
| 0.0343 | 27.6190 | 580 | 0.0436 | 0.9251 | 0.9594 | 0.9857 | 0.9978 | 0.9413 | 0.9391 | 0.9941 | 0.8782 | 0.9031 |
| 0.0407 | 28.5714 | 600 | 0.0434 | 0.9251 | 0.9597 | 0.9858 | 0.9977 | 0.9416 | 0.9396 | 0.9942 | 0.8784 | 0.9028 |
| 0.0419 | 29.5238 | 620 | 0.0445 | 0.9221 | 0.9586 | 0.9852 | 0.9977 | 0.9496 | 0.9285 | 0.9942 | 0.8743 | 0.8978 |
| 0.0506 | 30.4762 | 640 | 0.0425 | 0.9262 | 0.9593 | 0.9860 | 0.9978 | 0.9311 | 0.9491 | 0.9942 | 0.8791 | 0.9053 |
| 0.0422 | 31.4286 | 660 | 0.0424 | 0.9262 | 0.9595 | 0.9860 | 0.9977 | 0.9267 | 0.9540 | 0.9942 | 0.8790 | 0.9054 |
| 0.0362 | 32.3810 | 680 | 0.0425 | 0.9258 | 0.9600 | 0.9859 | 0.9977 | 0.9402 | 0.9421 | 0.9942 | 0.8793 | 0.9039 |
| 0.0437 | 33.3333 | 700 | 0.0424 | 0.9262 | 0.9599 | 0.9860 | 0.9978 | 0.9377 | 0.9441 | 0.9942 | 0.8796 | 0.9047 |
| 0.0363 | 34.2857 | 720 | 0.0415 | 0.9264 | 0.9602 | 0.9860 | 0.9976 | 0.9367 | 0.9463 | 0.9942 | 0.8800 | 0.9049 |
| 0.039 | 35.2381 | 740 | 0.0421 | 0.9267 | 0.9596 | 0.9861 | 0.9978 | 0.9290 | 0.9521 | 0.9942 | 0.8798 | 0.9060 |
| 0.0425 | 36.1905 | 760 | 0.0418 | 0.9259 | 0.9598 | 0.9859 | 0.9978 | 0.9391 | 0.9426 | 0.9942 | 0.8794 | 0.9040 |
| 0.0462 | 37.1429 | 780 | 0.0417 | 0.9267 | 0.9600 | 0.9861 | 0.9976 | 0.9311 | 0.9513 | 0.9942 | 0.8801 | 0.9057 |
| 0.0466 | 38.0952 | 800 | 0.0416 | 0.9261 | 0.9599 | 0.9860 | 0.9978 | 0.9392 | 0.9427 | 0.9942 | 0.8795 | 0.9045 |
| 0.0428 | 39.0476 | 820 | 0.0414 | 0.9266 | 0.9598 | 0.9861 | 0.9978 | 0.9323 | 0.9494 | 0.9942 | 0.8800 | 0.9057 |
| 0.04 | 40.0 | 840 | 0.0415 | 0.9264 | 0.9599 | 0.9860 | 0.9978 | 0.9362 | 0.9456 | 0.9942 | 0.8799 | 0.9052 |
### Framework versions
- Transformers 4.44.1
- Pytorch 2.6.0+cpu
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"background",
"corm",
"damage"
] |
EPFL-ECEO/segformer-b2-finetuned-coralscapes-1024-1024 |
# Model Card for Model ID
SegFormer model with a MiT-B2 backbone fine-tuned on Coralscapes at resolution 1024x1024, as introduced in [The Coralscapes Dataset: Semantic Scene Understanding in Coral Reefs](https://arxiv.org/abs/2503.20000).
## Model Details
### Model Description
- **Model type:** SegFormer
- **Finetuned from model:** [SegFormer (b2-sized) encoder pre-trained-only (`nvidia/mit-b2`)](https://huggingface.co/nvidia/mit-b2)
### Model Sources
- **Repository:** [coralscapesScripts](https://github.com/eceo-epfl/coralscapesScripts/)
- **Demo:** [Hugging Face Spaces](https://huggingface.co/spaces/EPFL-ECEO/coralscapes_demo)
## How to Get Started with the Model
The simplest way to use this model to segment an image of the Coralscapes dataset is as follows:
```python
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation
from datasets import load_dataset

# Load an image from the coralscapes dataset or load your own image
dataset = load_dataset("EPFL-ECEO/coralscapes")
image = dataset["test"][42]["image"]

preprocessor = SegformerImageProcessor.from_pretrained("EPFL-ECEO/segformer-b2-finetuned-coralscapes-1024-1024")
model = SegformerForSemanticSegmentation.from_pretrained("EPFL-ECEO/segformer-b2-finetuned-coralscapes-1024-1024")

inputs = preprocessor(image, return_tensors="pt")
outputs = model(**inputs)

# Resize predictions back to the original size; PIL's image.size is (width, height)
outputs = preprocessor.post_process_semantic_segmentation(outputs, target_sizes=[(image.size[1], image.size[0])])
label_pred = outputs[0].numpy()
```
While the approach above should still work for images of different sizes and scales, for images that are far from the model's training size (1024x1024)
we recommend the following sliding-window approach, which achieves better results:
```python
import torch
import torch.nn.functional as F
import numpy as np
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation
from datasets import load_dataset

device = 'cuda' if torch.cuda.is_available() else 'cpu'
def resize_image(image, target_size=1024):
    """
    Resize the image so that its smaller side equals `target_size`.
    """
    w_img, h_img = image.size  # PIL returns (width, height)
    if w_img < h_img:
        new_w, new_h = target_size, int(h_img * (target_size / w_img))
    else:
        new_w, new_h = int(w_img * (target_size / h_img)), target_size
    resized_img = image.resize((new_w, new_h))
    return resized_img

def segment_image(image, preprocessor, model, crop_size=(1024, 1024), num_classes=40, transform=None):
    """
    Finds an optimal stride based on the image size and aspect ratio to create
    overlapping sliding windows of size 1024x1024 which are then fed into the model.
    """
    h_crop, w_crop = crop_size

    img = torch.Tensor(np.array(resize_image(image, target_size=1024)).transpose(2, 0, 1)).unsqueeze(0)
    batch_size, _, h_img, w_img = img.size()

    if transform:
        img = torch.Tensor(transform(image=img.numpy())["image"]).to(device)

    h_grids = int(np.round(3 / 2 * h_img / h_crop)) if h_img > h_crop else 1
    w_grids = int(np.round(3 / 2 * w_img / w_crop)) if w_img > w_crop else 1

    h_stride = int((h_img - h_crop + h_grids - 1) / (h_grids - 1)) if h_grids > 1 else h_crop
    w_stride = int((w_img - w_crop + w_grids - 1) / (w_grids - 1)) if w_grids > 1 else w_crop

    preds = img.new_zeros((batch_size, num_classes, h_img, w_img))
    count_mat = img.new_zeros((batch_size, 1, h_img, w_img))

    for h_idx in range(h_grids):
        for w_idx in range(w_grids):
            # Window coordinates, clamped so every crop is exactly h_crop x w_crop.
            y1 = h_idx * h_stride
            x1 = w_idx * w_stride
            y2 = min(y1 + h_crop, h_img)
            x2 = min(x1 + w_crop, w_img)
            y1 = max(y2 - h_crop, 0)
            x1 = max(x2 - w_crop, 0)
            crop_img = img[:, :, y1:y2, x1:x2]
            with torch.no_grad():
                if preprocessor:
                    inputs = preprocessor(crop_img, return_tensors="pt")
                    inputs["pixel_values"] = inputs["pixel_values"].to(device)
                    outputs = model(**inputs)
                else:
                    outputs = model(pixel_values=crop_img.to(device))
            resized_logits = F.interpolate(
                outputs.logits[0].unsqueeze(dim=0), size=crop_img.shape[-2:], mode="bilinear", align_corners=False
            )
            # Accumulate the window's logits at its position and count the overlaps.
            preds += F.pad(resized_logits,
                           (int(x1), int(preds.shape[3] - x2), int(y1),
                            int(preds.shape[2] - y2))).cpu()
            count_mat[:, :, y1:y2, x1:x2] += 1
    assert (count_mat == 0).sum() == 0
    preds = preds / count_mat
    preds = preds.argmax(dim=1)
    # Resize the prediction back to the original image size (PIL size is (width, height)).
    preds = F.interpolate(preds.unsqueeze(0).type(torch.uint8), size=image.size[::-1], mode="nearest")
    label_pred = preds.squeeze().cpu().numpy()
    return label_pred
# Load an image from the coralscapes dataset or load your own image
dataset = load_dataset("EPFL-ECEO/coralscapes")
image = dataset["test"][42]["image"]

preprocessor = SegformerImageProcessor.from_pretrained("EPFL-ECEO/segformer-b2-finetuned-coralscapes-1024-1024")
model = SegformerForSemanticSegmentation.from_pretrained("EPFL-ECEO/segformer-b2-finetuned-coralscapes-1024-1024").to(device)

label_pred = segment_image(image, preprocessor, model)
```
## Training & Evaluation Details
### Data
The model is trained and evaluated on the [Coralscapes dataset](https://huggingface.co/datasets/EPFL-ECEO/coralscapes) which is a general-purpose dense semantic segmentation dataset for coral reefs.
### Procedure
Training is conducted following the original SegFormer [implementation](https://proceedings.neurips.cc/paper_files/paper/2021/file/64f1f27bf1b4ec22924fd0acb550c235-Paper.pdf), using a batch size of 8 for 265 epochs
with the AdamW optimizer, an initial learning rate of 6e-5, a weight decay of 1e-2, and a polynomial learning-rate scheduler with a power of 1.
During training, images are randomly scaled by a factor between 1 and 2, flipped horizontally with probability 0.5, and randomly cropped to 1024×1024 pixels.
Input images are normalized using the ImageNet mean and standard deviation. For evaluation, a non-overlapping sliding-window strategy is employed,
using a window size of 1024x1024.
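As an illustration, the augmentations described above might be written with `torchvision` as follows (a hedged sketch; the original training code lives in the linked repository and may differ in library and interpolation choices):

```python
import random
import torchvision.transforms.functional as TF
from torchvision import transforms

IMAGENET_MEAN = [0.485, 0.456, 0.406]
IMAGENET_STD = [0.229, 0.224, 0.225]

def train_transform(image, mask):
    # Randomly scale by a factor in [1, 2].
    scale = random.uniform(1.0, 2.0)
    w, h = image.size
    image = TF.resize(image, (int(h * scale), int(w * scale)))
    mask = TF.resize(mask, (int(h * scale), int(w * scale)),
                     interpolation=transforms.InterpolationMode.NEAREST)

    # Horizontal flip with probability 0.5 (applied jointly to image and mask).
    if random.random() < 0.5:
        image, mask = TF.hflip(image), TF.hflip(mask)

    # Random 1024x1024 crop (assumes the scaled image is at least 1024 on each side).
    i, j, th, tw = transforms.RandomCrop.get_params(image, (1024, 1024))
    image, mask = TF.crop(image, i, j, th, tw), TF.crop(mask, i, j, th, tw)

    # Normalize with the ImageNet mean and standard deviation.
    image = TF.normalize(TF.to_tensor(image), IMAGENET_MEAN, IMAGENET_STD)
    return image, mask
```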
### Results
- Test Accuracy: 80.904
- Test Mean IoU: 54.682
## Citation
If you find this project useful, please consider citing:
```bibtex
@misc{sauder2025coralscapesdatasetsemanticscene,
title={The Coralscapes Dataset: Semantic Scene Understanding in Coral Reefs},
author={Jonathan Sauder and Viktor Domazetoski and Guilhem Banc-Prandi and Gabriela Perna and Anders Meibom and Devis Tuia},
year={2025},
eprint={2503.20000},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2503.20000},
}
``` | [
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
"10",
"11",
"12",
"13",
"14",
"15",
"16",
"17",
"18",
"19",
"20",
"21",
"22",
"23",
"24",
"25",
"26",
"27",
"28",
"29",
"30",
"31",
"32",
"33",
"34",
"35",
"36",
"37",
"38",
"39"
] |
mujerry/segformer-b4-finetuned-ade-512-512_corm |
# segformer-b4-finetuned-ade-512-512_corm
This model is a fine-tuned version of [nvidia/segformer-b4-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b4-finetuned-ade-512-512) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0391
- Mean Iou: 0.9261
- Mean Accuracy: 0.9594
- Overall Accuracy: 0.9863
- Accuracy Background: 0.9977
- Accuracy Corm: 0.9268
- Accuracy Damage: 0.9537
- Iou Background: 0.9944
- Iou Corm: 0.8758
- Iou Damage: 0.9082
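Per-class segmentation metrics like these are commonly computed with the `evaluate` library's `mean_iou` metric; a hedged sketch of how the columns above could be produced (the prediction and label arrays here are random placeholders):

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Placeholders; in practice these are the model's per-pixel argmax
# predictions and the ground-truth masks for the evaluation set.
predictions = [np.random.randint(0, 3, (512, 512))]
references = [np.random.randint(0, 3, (512, 512))]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,        # background, corm, damage
    ignore_index=255,
    reduce_labels=False,
)
# `results` contains mean_iou, mean_accuracy, overall_accuracy,
# plus per_category_iou and per_category_accuracy arrays.
```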
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Corm | Accuracy Damage | Iou Background | Iou Corm | Iou Damage |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:---------------:|:--------------:|:--------:|:----------:|
| 1.0242 | 0.6061 | 20 | 1.0187 | 0.3316 | 0.6012 | 0.5573 | 0.5345 | 0.5453 | 0.7237 | 0.5341 | 0.1224 | 0.3383 |
| 0.7686 | 1.2121 | 40 | 0.6913 | 0.7059 | 0.8624 | 0.9170 | 0.9369 | 0.7070 | 0.9434 | 0.9369 | 0.4718 | 0.7090 |
| 0.5281 | 1.8182 | 60 | 0.4782 | 0.8060 | 0.9165 | 0.9537 | 0.9697 | 0.8764 | 0.9032 | 0.9696 | 0.6651 | 0.7833 |
| 0.3931 | 2.4242 | 80 | 0.3279 | 0.8530 | 0.9308 | 0.9690 | 0.9843 | 0.8578 | 0.9503 | 0.9837 | 0.7547 | 0.8206 |
| 0.2574 | 3.0303 | 100 | 0.2112 | 0.8733 | 0.9335 | 0.9753 | 0.9915 | 0.8406 | 0.9685 | 0.9899 | 0.7898 | 0.8402 |
| 0.2112 | 3.6364 | 120 | 0.1588 | 0.8990 | 0.9450 | 0.9807 | 0.9952 | 0.8824 | 0.9576 | 0.9918 | 0.8337 | 0.8716 |
| 0.1545 | 4.2424 | 140 | 0.1198 | 0.8960 | 0.9398 | 0.9805 | 0.9965 | 0.8539 | 0.9690 | 0.9924 | 0.8245 | 0.8711 |
| 0.1127 | 4.8485 | 160 | 0.1152 | 0.8851 | 0.9395 | 0.9782 | 0.9973 | 0.9609 | 0.8604 | 0.9923 | 0.8191 | 0.8440 |
| 0.1147 | 5.4545 | 180 | 0.0862 | 0.9130 | 0.9546 | 0.9834 | 0.9956 | 0.9170 | 0.9513 | 0.9930 | 0.8579 | 0.8881 |
| 0.0945 | 6.0606 | 200 | 0.0793 | 0.9083 | 0.9457 | 0.9829 | 0.9977 | 0.8728 | 0.9667 | 0.9929 | 0.8437 | 0.8881 |
| 0.0942 | 6.6667 | 220 | 0.0730 | 0.9170 | 0.9519 | 0.9842 | 0.9985 | 0.9263 | 0.9309 | 0.9926 | 0.8624 | 0.8960 |
| 0.0766 | 7.2727 | 240 | 0.0675 | 0.9185 | 0.9565 | 0.9847 | 0.9972 | 0.9365 | 0.9358 | 0.9936 | 0.8656 | 0.8963 |
| 0.0674 | 7.8788 | 260 | 0.0635 | 0.9160 | 0.9523 | 0.9844 | 0.9972 | 0.8897 | 0.9700 | 0.9937 | 0.8585 | 0.8957 |
| 0.0662 | 8.4848 | 280 | 0.0593 | 0.9199 | 0.9520 | 0.9849 | 0.9985 | 0.8984 | 0.9590 | 0.9931 | 0.8637 | 0.9030 |
| 0.0683 | 9.0909 | 300 | 0.0582 | 0.9176 | 0.9516 | 0.9847 | 0.9981 | 0.8914 | 0.9652 | 0.9937 | 0.8598 | 0.8994 |
| 0.0591 | 9.6970 | 320 | 0.0548 | 0.9222 | 0.9565 | 0.9855 | 0.9974 | 0.9076 | 0.9644 | 0.9941 | 0.8693 | 0.9031 |
| 0.0649 | 10.3030 | 340 | 0.0541 | 0.9201 | 0.9553 | 0.9849 | 0.9984 | 0.9407 | 0.9269 | 0.9935 | 0.8685 | 0.8982 |
| 0.0622 | 10.9091 | 360 | 0.0525 | 0.9155 | 0.9497 | 0.9844 | 0.9982 | 0.8803 | 0.9707 | 0.9938 | 0.8551 | 0.8976 |
| 0.0637 | 11.5152 | 380 | 0.0542 | 0.9124 | 0.9466 | 0.9838 | 0.9986 | 0.8720 | 0.9692 | 0.9933 | 0.8491 | 0.8948 |
| 0.0647 | 12.1212 | 400 | 0.0486 | 0.9244 | 0.9562 | 0.9857 | 0.9988 | 0.9340 | 0.9358 | 0.9933 | 0.8738 | 0.9060 |
| 0.0418 | 12.7273 | 420 | 0.0466 | 0.9267 | 0.9589 | 0.9862 | 0.9980 | 0.9288 | 0.9500 | 0.9939 | 0.8773 | 0.9088 |
| 0.0459 | 13.3333 | 440 | 0.0460 | 0.9260 | 0.9601 | 0.9862 | 0.9977 | 0.9378 | 0.9448 | 0.9942 | 0.8770 | 0.9069 |
| 0.048 | 13.9394 | 460 | 0.0453 | 0.9253 | 0.9586 | 0.9861 | 0.9981 | 0.9335 | 0.9443 | 0.9941 | 0.8751 | 0.9067 |
| 0.0397 | 14.5455 | 480 | 0.0446 | 0.9263 | 0.9589 | 0.9863 | 0.9978 | 0.9219 | 0.9569 | 0.9943 | 0.8759 | 0.9088 |
| 0.0546 | 15.1515 | 500 | 0.0457 | 0.9219 | 0.9572 | 0.9854 | 0.9983 | 0.9447 | 0.9285 | 0.9940 | 0.8707 | 0.9010 |
| 0.0427 | 15.7576 | 520 | 0.0432 | 0.9267 | 0.9611 | 0.9863 | 0.9973 | 0.9374 | 0.9485 | 0.9943 | 0.8774 | 0.9084 |
| 0.0463 | 16.3636 | 540 | 0.0424 | 0.9263 | 0.9576 | 0.9863 | 0.9982 | 0.9148 | 0.9598 | 0.9941 | 0.8750 | 0.9098 |
| 0.048 | 16.9697 | 560 | 0.0421 | 0.9272 | 0.9588 | 0.9865 | 0.9981 | 0.9229 | 0.9555 | 0.9942 | 0.8776 | 0.9099 |
| 0.0534 | 17.5758 | 580 | 0.0420 | 0.9269 | 0.9610 | 0.9863 | 0.9975 | 0.9393 | 0.9460 | 0.9943 | 0.8777 | 0.9085 |
| 0.0411 | 18.1818 | 600 | 0.0420 | 0.9249 | 0.9581 | 0.9861 | 0.9976 | 0.9133 | 0.9634 | 0.9944 | 0.8734 | 0.9068 |
| 0.0417 | 18.7879 | 620 | 0.0425 | 0.9231 | 0.9573 | 0.9858 | 0.9973 | 0.9042 | 0.9705 | 0.9945 | 0.8699 | 0.9048 |
| 0.0368 | 19.3939 | 640 | 0.0405 | 0.9283 | 0.9603 | 0.9867 | 0.9979 | 0.9281 | 0.9548 | 0.9944 | 0.8795 | 0.9111 |
| 0.048 | 20.0 | 660 | 0.0399 | 0.9279 | 0.9608 | 0.9866 | 0.9977 | 0.9325 | 0.9522 | 0.9944 | 0.8793 | 0.9100 |
| 0.0363 | 20.6061 | 680 | 0.0415 | 0.9255 | 0.9593 | 0.9861 | 0.9981 | 0.9421 | 0.9378 | 0.9943 | 0.8761 | 0.9061 |
| 0.0459 | 21.2121 | 700 | 0.0421 | 0.9243 | 0.9583 | 0.9858 | 0.9984 | 0.9471 | 0.9294 | 0.9940 | 0.8748 | 0.9042 |
| 0.0436 | 21.8182 | 720 | 0.0403 | 0.9269 | 0.9610 | 0.9863 | 0.9976 | 0.9426 | 0.9429 | 0.9943 | 0.8782 | 0.9080 |
| 0.0461 | 22.4242 | 740 | 0.0406 | 0.9260 | 0.9615 | 0.9862 | 0.9974 | 0.9476 | 0.9396 | 0.9945 | 0.8771 | 0.9065 |
| 0.0319 | 23.0303 | 760 | 0.0395 | 0.9269 | 0.9614 | 0.9864 | 0.9970 | 0.9306 | 0.9567 | 0.9944 | 0.8780 | 0.9082 |
| 0.0366 | 23.6364 | 780 | 0.0392 | 0.9277 | 0.9607 | 0.9866 | 0.9978 | 0.9352 | 0.9492 | 0.9944 | 0.8793 | 0.9095 |
| 0.0351 | 24.2424 | 800 | 0.0390 | 0.9282 | 0.9605 | 0.9866 | 0.9979 | 0.9338 | 0.9497 | 0.9943 | 0.8796 | 0.9106 |
| 0.0322 | 24.8485 | 820 | 0.0388 | 0.9280 | 0.9600 | 0.9866 | 0.9980 | 0.9289 | 0.9531 | 0.9944 | 0.8790 | 0.9108 |
| 0.0346 | 25.4545 | 840 | 0.0392 | 0.9266 | 0.9595 | 0.9864 | 0.9976 | 0.9173 | 0.9635 | 0.9944 | 0.8760 | 0.9095 |
| 0.0342 | 26.0606 | 860 | 0.0398 | 0.9243 | 0.9575 | 0.9861 | 0.9976 | 0.9059 | 0.9691 | 0.9945 | 0.8715 | 0.9070 |
| 0.0389 | 26.6667 | 880 | 0.0387 | 0.9275 | 0.9617 | 0.9865 | 0.9970 | 0.9272 | 0.9609 | 0.9945 | 0.8793 | 0.9087 |
| 0.033 | 27.2727 | 900 | 0.0392 | 0.9272 | 0.9596 | 0.9865 | 0.9976 | 0.9168 | 0.9645 | 0.9944 | 0.8772 | 0.9099 |
| 0.0316 | 27.8788 | 920 | 0.0388 | 0.9269 | 0.9602 | 0.9865 | 0.9973 | 0.9196 | 0.9636 | 0.9945 | 0.8768 | 0.9095 |
| 0.0391 | 28.4848 | 940 | 0.0396 | 0.9262 | 0.9604 | 0.9863 | 0.9970 | 0.9200 | 0.9642 | 0.9945 | 0.8760 | 0.9082 |
| 0.0305 | 29.0909 | 960 | 0.0386 | 0.9275 | 0.9600 | 0.9865 | 0.9978 | 0.9241 | 0.9580 | 0.9945 | 0.8780 | 0.9101 |
| 0.034 | 29.6970 | 980 | 0.0392 | 0.9267 | 0.9600 | 0.9863 | 0.9980 | 0.9399 | 0.9421 | 0.9943 | 0.8777 | 0.9081 |
| 0.0322 | 30.3030 | 1000 | 0.0383 | 0.9275 | 0.9607 | 0.9865 | 0.9976 | 0.9321 | 0.9524 | 0.9945 | 0.8786 | 0.9094 |
| 0.0288 | 30.9091 | 1020 | 0.0389 | 0.9271 | 0.9606 | 0.9864 | 0.9976 | 0.9339 | 0.9504 | 0.9944 | 0.8782 | 0.9087 |
| 0.0324 | 31.5152 | 1040 | 0.0394 | 0.9265 | 0.9601 | 0.9863 | 0.9978 | 0.9362 | 0.9462 | 0.9944 | 0.8770 | 0.9080 |
| 0.0329 | 32.1212 | 1060 | 0.0399 | 0.9259 | 0.9599 | 0.9862 | 0.9980 | 0.9421 | 0.9396 | 0.9943 | 0.8767 | 0.9068 |
| 0.0211 | 32.7273 | 1080 | 0.0390 | 0.9268 | 0.9593 | 0.9864 | 0.9981 | 0.9310 | 0.9490 | 0.9943 | 0.8773 | 0.9087 |
| 0.0227 | 33.3333 | 1100 | 0.0389 | 0.9269 | 0.9581 | 0.9864 | 0.9983 | 0.9194 | 0.9565 | 0.9941 | 0.8764 | 0.9101 |
| 0.0328 | 33.9394 | 1120 | 0.0391 | 0.9270 | 0.9587 | 0.9864 | 0.9983 | 0.9284 | 0.9494 | 0.9941 | 0.8773 | 0.9096 |
| 0.0297 | 34.5455 | 1140 | 0.0389 | 0.9267 | 0.9597 | 0.9864 | 0.9979 | 0.9304 | 0.9509 | 0.9944 | 0.8771 | 0.9087 |
| 0.0346 | 35.1515 | 1160 | 0.0390 | 0.9267 | 0.9595 | 0.9864 | 0.9979 | 0.9292 | 0.9516 | 0.9943 | 0.8769 | 0.9088 |
| 0.0231 | 35.7576 | 1180 | 0.0391 | 0.9266 | 0.9587 | 0.9863 | 0.9981 | 0.9232 | 0.9547 | 0.9942 | 0.8764 | 0.9093 |
| 0.0301 | 36.3636 | 1200 | 0.0387 | 0.9267 | 0.9594 | 0.9864 | 0.9978 | 0.9232 | 0.9572 | 0.9944 | 0.8764 | 0.9093 |
| 0.0331 | 36.9697 | 1220 | 0.0388 | 0.9269 | 0.9597 | 0.9864 | 0.9979 | 0.9290 | 0.9522 | 0.9943 | 0.8772 | 0.9091 |
| 0.0281 | 37.5758 | 1240 | 0.0389 | 0.9268 | 0.9589 | 0.9864 | 0.9981 | 0.9266 | 0.9520 | 0.9943 | 0.8769 | 0.9093 |
| 0.0208 | 38.1818 | 1260 | 0.0390 | 0.9266 | 0.9605 | 0.9863 | 0.9975 | 0.9318 | 0.9523 | 0.9944 | 0.8768 | 0.9086 |
| 0.0348 | 38.7879 | 1280 | 0.0397 | 0.9257 | 0.9598 | 0.9862 | 0.9978 | 0.9387 | 0.9429 | 0.9943 | 0.8760 | 0.9068 |
| 0.0276 | 39.3939 | 1300 | 0.0388 | 0.9269 | 0.9590 | 0.9864 | 0.9981 | 0.9267 | 0.9522 | 0.9942 | 0.8772 | 0.9093 |
| 0.0286 | 40.0 | 1320 | 0.0395 | 0.9248 | 0.9572 | 0.9861 | 0.9979 | 0.9082 | 0.9655 | 0.9944 | 0.8723 | 0.9076 |
| 0.0298 | 40.6061 | 1340 | 0.0391 | 0.9263 | 0.9592 | 0.9863 | 0.9977 | 0.9212 | 0.9587 | 0.9944 | 0.8759 | 0.9087 |
| 0.0235 | 41.2121 | 1360 | 0.0389 | 0.9262 | 0.9590 | 0.9863 | 0.9978 | 0.9220 | 0.9574 | 0.9944 | 0.8757 | 0.9085 |
| 0.0223 | 41.8182 | 1380 | 0.0392 | 0.9265 | 0.9593 | 0.9863 | 0.9979 | 0.9273 | 0.9528 | 0.9943 | 0.8765 | 0.9088 |
| 0.0216 | 42.4242 | 1400 | 0.0390 | 0.9264 | 0.9592 | 0.9863 | 0.9979 | 0.9265 | 0.9531 | 0.9943 | 0.8761 | 0.9086 |
| 0.027 | 43.0303 | 1420 | 0.0395 | 0.9264 | 0.9601 | 0.9863 | 0.9976 | 0.9315 | 0.9511 | 0.9944 | 0.8764 | 0.9083 |
| 0.0261 | 43.6364 | 1440 | 0.0392 | 0.9265 | 0.9593 | 0.9863 | 0.9980 | 0.9309 | 0.9491 | 0.9943 | 0.8768 | 0.9083 |
| 0.0213 | 44.2424 | 1460 | 0.0390 | 0.9262 | 0.9592 | 0.9863 | 0.9977 | 0.9213 | 0.9584 | 0.9944 | 0.8756 | 0.9087 |
| 0.0228 | 44.8485 | 1480 | 0.0390 | 0.9260 | 0.9596 | 0.9863 | 0.9976 | 0.9240 | 0.9571 | 0.9944 | 0.8756 | 0.9081 |
| 0.0317 | 45.4545 | 1500 | 0.0391 | 0.9259 | 0.9584 | 0.9862 | 0.9980 | 0.9227 | 0.9544 | 0.9942 | 0.8751 | 0.9082 |
| 0.0244 | 46.0606 | 1520 | 0.0392 | 0.9261 | 0.9591 | 0.9863 | 0.9979 | 0.9274 | 0.9521 | 0.9943 | 0.8759 | 0.9081 |
| 0.0282 | 46.6667 | 1540 | 0.0391 | 0.9259 | 0.9589 | 0.9862 | 0.9978 | 0.9224 | 0.9564 | 0.9944 | 0.8750 | 0.9082 |
| 0.0244 | 47.2727 | 1560 | 0.0397 | 0.9262 | 0.9592 | 0.9863 | 0.9979 | 0.9280 | 0.9519 | 0.9944 | 0.8760 | 0.9081 |
| 0.0265 | 47.8788 | 1580 | 0.0393 | 0.9258 | 0.9592 | 0.9862 | 0.9977 | 0.9226 | 0.9572 | 0.9944 | 0.8751 | 0.9080 |
| 0.0282 | 48.4848 | 1600 | 0.0394 | 0.9260 | 0.9585 | 0.9863 | 0.9980 | 0.9209 | 0.9567 | 0.9943 | 0.8752 | 0.9084 |
| 0.0229 | 49.0909 | 1620 | 0.0390 | 0.9262 | 0.9592 | 0.9863 | 0.9979 | 0.9281 | 0.9517 | 0.9943 | 0.8760 | 0.9083 |
| 0.0236 | 49.6970 | 1640 | 0.0391 | 0.9261 | 0.9594 | 0.9863 | 0.9977 | 0.9268 | 0.9537 | 0.9944 | 0.8758 | 0.9082 |
### Framework versions
- Transformers 4.44.1
- Pytorch 2.6.0+cpu
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"background",
"corm",
"damage"
] |
digscom/table-transformer-page-segmentation-floorplan |
# table-transformer-page-segmentation-floorplan
This model was trained from scratch on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50.0
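The card gives no usage example; assuming the checkpoint keeps the Table Transformer detection head, loading it would look roughly like this (a hedged sketch, not an official snippet from the authors; the image path is a placeholder):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "digscom/table-transformer-page-segmentation-floorplan"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("page.png").convert("RGB")  # placeholder path
inputs = processor(image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Detected regions labeled layer_floorplan / layer_non_floorplan, per the card's label list.
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=[image.size[::-1]]
)[0]
```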
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"layer_non_floorplan",
"layer_floorplan"
] |
Supritha09/segformer-b0-finetuned-segments-sidewalk-oct-22 |
# segformer-b0-finetuned-segments-sidewalk-oct-22
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7416
- Mean Iou: 0.2291
- Mean Accuracy: 0.2745
- Overall Accuracy: 0.8148
- Accuracy Unlabeled: nan
- Accuracy Flat-road: 0.8989
- Accuracy Flat-sidewalk: 0.9334
- Accuracy Flat-crosswalk: 0.3286
- Accuracy Flat-cyclinglane: 0.7157
- Accuracy Flat-parkingdriveway: 0.4025
- Accuracy Flat-railtrack: nan
- Accuracy Flat-curb: 0.2777
- Accuracy Human-person: 0.0070
- Accuracy Human-rider: 0.0
- Accuracy Vehicle-car: 0.9165
- Accuracy Vehicle-truck: 0.0
- Accuracy Vehicle-bus: 0.0
- Accuracy Vehicle-tramtrain: nan
- Accuracy Vehicle-motorcycle: 0.0
- Accuracy Vehicle-bicycle: 0.0015
- Accuracy Vehicle-caravan: 0.0
- Accuracy Vehicle-cartrailer: 0.0
- Accuracy Construction-building: 0.9119
- Accuracy Construction-door: 0.0
- Accuracy Construction-wall: 0.3250
- Accuracy Construction-fenceguardrail: 0.0193
- Accuracy Construction-bridge: 0.0
- Accuracy Construction-tunnel: nan
- Accuracy Construction-stairs: 0.0
- Accuracy Object-pole: 0.0010
- Accuracy Object-trafficsign: 0.0
- Accuracy Object-trafficlight: 0.0
- Accuracy Nature-vegetation: 0.9432
- Accuracy Nature-terrain: 0.8601
- Accuracy Sky: 0.9549
- Accuracy Void-ground: 0.0
- Accuracy Void-dynamic: 0.0
- Accuracy Void-static: 0.0118
- Accuracy Void-unclear: 0.0
- Iou Unlabeled: nan
- Iou Flat-road: 0.6950
- Iou Flat-sidewalk: 0.8358
- Iou Flat-crosswalk: 0.3244
- Iou Flat-cyclinglane: 0.6194
- Iou Flat-parkingdriveway: 0.2588
- Iou Flat-railtrack: nan
- Iou Flat-curb: 0.2398
- Iou Human-person: 0.0070
- Iou Human-rider: 0.0
- Iou Vehicle-car: 0.7419
- Iou Vehicle-truck: 0.0
- Iou Vehicle-bus: 0.0
- Iou Vehicle-tramtrain: nan
- Iou Vehicle-motorcycle: 0.0
- Iou Vehicle-bicycle: 0.0015
- Iou Vehicle-caravan: 0.0
- Iou Vehicle-cartrailer: 0.0
- Iou Construction-building: 0.6416
- Iou Construction-door: 0.0
- Iou Construction-wall: 0.2560
- Iou Construction-fenceguardrail: 0.0193
- Iou Construction-bridge: 0.0
- Iou Construction-tunnel: nan
- Iou Construction-stairs: 0.0
- Iou Object-pole: 0.0010
- Iou Object-trafficsign: 0.0
- Iou Object-trafficlight: 0.0
- Iou Nature-vegetation: 0.8168
- Iou Nature-terrain: 0.7483
- Iou Sky: 0.8861
- Iou Void-ground: 0.0
- Iou Void-dynamic: 0.0
- Iou Void-static: 0.0109
- Iou Void-unclear: 0.0
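The `nan` entries mark classes that do not occur in the evaluation split; the reported means are taken over the remaining classes only, which can be checked with `np.nanmean` (a minimal sketch using the per-class IoUs listed above):

```python
import numpy as np

# Per-class IoUs from the list above, in order (unlabeled first);
# nan marks classes absent from the evaluation split.
ious = [np.nan, 0.6950, 0.8358, 0.3244, 0.6194, 0.2588, np.nan, 0.2398,
        0.0070, 0.0, 0.7419, 0.0, 0.0, np.nan, 0.0, 0.0015, 0.0, 0.0,
        0.6416, 0.0, 0.2560, 0.0193, 0.0, np.nan, 0.0, 0.0010, 0.0, 0.0,
        0.8168, 0.7483, 0.8861, 0.0, 0.0, 0.0109, 0.0]

print(round(np.nanmean(ious), 4))  # 0.2291 -> matches Mean Iou
```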
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:|
| 2.891 | 0.2 | 20 | 3.1216 | 0.0951 | 0.1603 | 0.6183 | nan | 0.4194 | 0.8606 | 0.0001 | 0.4368 | 0.0017 | nan | 0.0086 | 0.0 | 0.0 | 0.6356 | 0.0 | 0.0 | nan | 0.0 | 0.0250 | 0.0 | 0.0 | 0.7514 | 0.0 | 0.0262 | 0.0019 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.9113 | 0.0001 | 0.8849 | 0.0005 | 0.0045 | 0.0 | 0.0 | 0.0 | 0.3415 | 0.6805 | 0.0001 | 0.1711 | 0.0017 | 0.0 | 0.0074 | 0.0 | 0.0 | 0.4352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0163 | 0.0 | 0.0 | 0.4522 | 0.0 | 0.0230 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0010 | 0.0 | 0.0 | 0.5877 | 0.0001 | 0.6065 | 0.0005 | 0.0019 | 0.0 | 0.0 |
| 2.3264 | 0.4 | 40 | 2.2665 | 0.1128 | 0.1686 | 0.6600 | nan | 0.7301 | 0.8661 | 0.0 | 0.0848 | 0.0001 | nan | 0.0000 | 0.0 | 0.0 | 0.7658 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7971 | 0.0 | 0.0035 | 0.0001 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9315 | 0.1447 | 0.9028 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.4192 | 0.6898 | 0.0 | 0.0764 | 0.0001 | nan | 0.0000 | 0.0 | 0.0 | 0.4590 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.4907 | 0.0 | 0.0034 | 0.0001 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6540 | 0.1423 | 0.6736 | 0.0 | 0.0000 | 0.0 | 0.0 |
| 1.9157 | 0.6 | 60 | 1.9516 | 0.1325 | 0.1843 | 0.6839 | nan | 0.7909 | 0.8541 | 0.0 | 0.0396 | 0.0002 | nan | 0.0000 | 0.0 | 0.0 | 0.7718 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8330 | 0.0 | 0.0005 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9406 | 0.5619 | 0.9207 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4525 | 0.7067 | 0.0 | 0.0362 | 0.0002 | nan | 0.0000 | 0.0 | 0.0 | 0.5311 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.4823 | 0.0 | 0.0005 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6938 | 0.5098 | 0.6948 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8405 | 0.8 | 80 | 1.7340 | 0.1407 | 0.1888 | 0.7053 | nan | 0.8148 | 0.8969 | 0.0 | 0.0510 | 0.0004 | nan | 0.0 | 0.0 | 0.0 | 0.8138 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8464 | 0.0 | 0.0003 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9380 | 0.5830 | 0.9080 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4845 | 0.7304 | 0.0 | 0.0484 | 0.0004 | nan | 0.0 | 0.0 | 0.0 | 0.5936 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5084 | 0.0 | 0.0003 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7069 | 0.5392 | 0.7494 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.6934 | 1.0 | 100 | 1.5898 | 0.1520 | 0.1996 | 0.7254 | nan | 0.7606 | 0.9301 | 0.0 | 0.1821 | 0.0011 | nan | 0.0 | 0.0 | 0.0 | 0.8730 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8520 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9284 | 0.7616 | 0.8976 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5321 | 0.7299 | 0.0 | 0.1717 | 0.0011 | nan | 0.0 | 0.0 | 0.0 | 0.5806 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5302 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7315 | 0.6628 | 0.7735 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7481 | 1.2 | 120 | 1.5167 | 0.1484 | 0.1982 | 0.7251 | nan | 0.8440 | 0.9206 | 0.0 | 0.0644 | 0.0006 | nan | 0.0 | 0.0 | 0.0 | 0.8973 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8191 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9283 | 0.7458 | 0.9235 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5165 | 0.7503 | 0.0 | 0.0627 | 0.0006 | nan | 0.0 | 0.0 | 0.0 | 0.5667 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5468 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7403 | 0.6554 | 0.7619 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3889 | 1.4 | 140 | 1.4462 | 0.1507 | 0.2014 | 0.7272 | nan | 0.8992 | 0.9043 | 0.0 | 0.0546 | 0.0016 | nan | 0.0 | 0.0 | 0.0 | 0.8818 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8365 | 0.0 | 0.0010 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9051 | 0.8422 | 0.9176 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5082 | 0.7587 | 0.0 | 0.0528 | 0.0016 | nan | 0.0 | 0.0 | 0.0 | 0.6098 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5474 | 0.0 | 0.0010 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7559 | 0.6651 | 0.7722 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4823 | 1.6 | 160 | 1.3622 | 0.1626 | 0.2118 | 0.7409 | nan | 0.8999 | 0.8858 | 0.0 | 0.3655 | 0.0056 | nan | 0.0 | 0.0 | 0.0 | 0.8434 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8777 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9283 | 0.8646 | 0.8945 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5426 | 0.7635 | 0.0 | 0.3303 | 0.0056 | nan | 0.0 | 0.0 | 0.0 | 0.6279 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5397 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7482 | 0.6875 | 0.7965 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.6763 | 1.8 | 180 | 1.2656 | 0.1637 | 0.2109 | 0.7469 | nan | 0.8304 | 0.9270 | 0.0 | 0.4478 | 0.0037 | nan | 0.0 | 0.0 | 0.0 | 0.8452 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8393 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9476 | 0.7676 | 0.9299 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5718 | 0.7625 | 0.0 | 0.3700 | 0.0037 | nan | 0.0 | 0.0 | 0.0 | 0.6451 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5502 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7327 | 0.6711 | 0.7673 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1681 | 2.0 | 200 | 1.2474 | 0.1659 | 0.2148 | 0.7536 | nan | 0.8419 | 0.9321 | 0.0 | 0.4820 | 0.0035 | nan | 0.0 | 0.0 | 0.0 | 0.8932 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8270 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9421 | 0.8015 | 0.9358 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5855 | 0.7748 | 0.0 | 0.3927 | 0.0035 | nan | 0.0 | 0.0 | 0.0 | 0.6156 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5531 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7535 | 0.6955 | 0.7681 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.2264 | 2.2 | 220 | 1.2089 | 0.1708 | 0.2170 | 0.7595 | nan | 0.8759 | 0.9230 | 0.0 | 0.5296 | 0.0126 | nan | 0.0 | 0.0 | 0.0 | 0.9001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8810 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9444 | 0.7591 | 0.9003 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5883 | 0.7836 | 0.0 | 0.4233 | 0.0124 | nan | 0.0 | 0.0 | 0.0 | 0.6433 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5721 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7561 | 0.6792 | 0.8357 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3107 | 2.4 | 240 | 1.1461 | 0.1708 | 0.2179 | 0.7617 | nan | 0.8651 | 0.9399 | 0.0 | 0.4766 | 0.0143 | nan | 0.0 | 0.0 | 0.0 | 0.9046 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8429 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9341 | 0.8365 | 0.9417 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5969 | 0.7763 | 0.0 | 0.4213 | 0.0139 | nan | 0.0 | 0.0 | 0.0 | 0.6364 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5682 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7708 | 0.7134 | 0.7979 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9876 | 2.6 | 260 | 1.1106 | 0.1742 | 0.2189 | 0.7660 | nan | 0.8686 | 0.9407 | 0.0 | 0.4912 | 0.0173 | nan | 0.0 | 0.0 | 0.0 | 0.8871 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8980 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9302 | 0.8646 | 0.8897 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6102 | 0.7810 | 0.0 | 0.4237 | 0.0167 | nan | 0.0 | 0.0 | 0.0 | 0.6702 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5725 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7719 | 0.7128 | 0.8401 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3372 | 2.8 | 280 | 1.1123 | 0.1744 | 0.2215 | 0.7613 | nan | 0.9173 | 0.9002 | 0.0 | 0.5475 | 0.0324 | nan | 0.0 | 0.0 | 0.0 | 0.8878 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8935 | 0.0 | 0.0009 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9384 | 0.8193 | 0.9301 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5548 | 0.7972 | 0.0 | 0.4622 | 0.0307 | nan | 0.0 | 0.0 | 0.0 | 0.6641 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5703 | 0.0 | 0.0009 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7717 | 0.7114 | 0.8425 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5488 | 3.0 | 300 | 1.0372 | 0.1756 | 0.2208 | 0.7680 | nan | 0.9115 | 0.9291 | 0.0 | 0.5073 | 0.0389 | nan | 0.0 | 0.0 | 0.0 | 0.8606 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8975 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9342 | 0.8394 | 0.9267 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6024 | 0.7977 | 0.0 | 0.4650 | 0.0370 | nan | 0.0 | 0.0 | 0.0 | 0.6831 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5479 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7773 | 0.7216 | 0.8106 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1759 | 3.2 | 320 | 1.0484 | 0.1785 | 0.2267 | 0.7702 | nan | 0.8946 | 0.9124 | 0.0 | 0.6714 | 0.0525 | nan | 0.0000 | 0.0 | 0.0 | 0.8762 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8826 | 0.0 | 0.0054 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9438 | 0.8703 | 0.9188 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5967 | 0.8033 | 0.0 | 0.4787 | 0.0477 | nan | 0.0000 | 0.0 | 0.0 | 0.6821 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5846 | 0.0 | 0.0054 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7683 | 0.7150 | 0.8520 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8541 | 3.4 | 340 | 1.0273 | 0.1770 | 0.2243 | 0.7659 | nan | 0.9237 | 0.9099 | 0.0 | 0.5054 | 0.0866 | nan | 0.0 | 0.0 | 0.0 | 0.9191 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8701 | 0.0 | 0.0081 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9372 | 0.8427 | 0.9505 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5690 | 0.7960 | 0.0 | 0.4673 | 0.0751 | nan | 0.0 | 0.0 | 0.0 | 0.6575 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5798 | 0.0 | 0.0081 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7825 | 0.7274 | 0.8230 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0464 | 3.6 | 360 | 0.9769 | 0.1833 | 0.2289 | 0.7768 | nan | 0.8998 | 0.9292 | 0.0 | 0.6006 | 0.1114 | nan | 0.0000 | 0.0 | 0.0 | 0.8880 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8924 | 0.0 | 0.0368 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9428 | 0.8622 | 0.9326 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6203 | 0.8026 | 0.0 | 0.5209 | 0.0920 | nan | 0.0000 | 0.0 | 0.0 | 0.6911 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5814 | 0.0 | 0.0367 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7761 | 0.7166 | 0.8453 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8219 | 3.8 | 380 | 0.9641 | 0.1859 | 0.2309 | 0.7789 | nan | 0.9159 | 0.9311 | 0.0 | 0.6214 | 0.1352 | nan | 0.0002 | 0.0 | 0.0 | 0.8860 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9157 | 0.0 | 0.0507 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9143 | 0.8529 | 0.9340 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6242 | 0.8045 | 0.0 | 0.5343 | 0.1098 | nan | 0.0002 | 0.0 | 0.0 | 0.6954 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5730 | 0.0 | 0.0503 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7967 | 0.7210 | 0.8526 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1492 | 4.0 | 400 | 0.9505 | 0.1851 | 0.2339 | 0.7758 | nan | 0.8427 | 0.9268 | 0.0 | 0.6710 | 0.1980 | nan | 0.0006 | 0.0 | 0.0 | 0.9116 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8880 | 0.0 | 0.0566 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9362 | 0.8778 | 0.9423 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6045 | 0.8093 | 0.0 | 0.4687 | 0.1445 | nan | 0.0006 | 0.0 | 0.0 | 0.6872 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5974 | 0.0 | 0.0559 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7857 | 0.7209 | 0.8626 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1718 | 4.2 | 420 | 0.9337 | 0.1873 | 0.2341 | 0.7788 | nan | 0.8707 | 0.9244 | 0.0 | 0.6817 | 0.2303 | nan | 0.0017 | 0.0 | 0.0 | 0.8943 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9001 | 0.0 | 0.0617 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9526 | 0.7929 | 0.9453 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.6195 | 0.8149 | 0.0 | 0.5256 | 0.1628 | nan | 0.0017 | 0.0 | 0.0 | 0.6922 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5890 | 0.0 | 0.0592 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7731 | 0.7023 | 0.8672 | 0.0 | 0.0 | 0.0000 | 0.0 |
| 0.9714 | 4.4 | 440 | 0.9085 | 0.1907 | 0.2385 | 0.7822 | nan | 0.9118 | 0.9157 | 0.0 | 0.6726 | 0.2666 | nan | 0.0008 | 0.0 | 0.0 | 0.9173 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8887 | 0.0 | 0.0716 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9339 | 0.8666 | 0.9472 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6122 | 0.8149 | 0.0 | 0.5714 | 0.1727 | nan | 0.0008 | 0.0 | 0.0 | 0.6815 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6039 | 0.0 | 0.0694 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7931 | 0.7353 | 0.8570 | 0.0 | 0.0 | 0.0001 | 0.0 |
| 1.0207 | 4.6 | 460 | 0.9086 | 0.1926 | 0.2388 | 0.7808 | nan | 0.9324 | 0.8997 | 0.0 | 0.6603 | 0.2709 | nan | 0.0039 | 0.0 | 0.0 | 0.8732 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9122 | 0.0 | 0.0977 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9461 | 0.8705 | 0.9368 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.6062 | 0.8141 | 0.0 | 0.5818 | 0.1757 | nan | 0.0038 | 0.0 | 0.0 | 0.7139 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5908 | 0.0 | 0.0930 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7864 | 0.7319 | 0.8730 | 0.0 | 0.0 | 0.0000 | 0.0 |
| 0.8914 | 4.8 | 480 | 0.8951 | 0.1940 | 0.2416 | 0.7858 | nan | 0.8807 | 0.9288 | 0.0000 | 0.6781 | 0.2767 | nan | 0.0095 | 0.0 | 0.0 | 0.9135 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9038 | 0.0 | 0.1437 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9274 | 0.8739 | 0.9529 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6264 | 0.8138 | 0.0000 | 0.5377 | 0.1911 | nan | 0.0094 | 0.0 | 0.0 | 0.7070 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6044 | 0.0 | 0.1329 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8004 | 0.7307 | 0.8598 | 0.0 | 0.0 | 0.0001 | 0.0 |
| 0.8834 | 5.0 | 500 | 0.8636 | 0.1973 | 0.2415 | 0.7882 | nan | 0.9236 | 0.9252 | 0.0 | 0.6543 | 0.2757 | nan | 0.0116 | 0.0 | 0.0 | 0.9042 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9141 | 0.0 | 0.1735 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9342 | 0.8266 | 0.9421 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.6188 | 0.8160 | 0.0 | 0.6008 | 0.1853 | nan | 0.0115 | 0.0 | 0.0 | 0.7110 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6058 | 0.0 | 0.1560 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8073 | 0.7327 | 0.8720 | 0.0 | 0.0 | 0.0000 | 0.0 |
| 0.8539 | 5.2 | 520 | 0.8547 | 0.1980 | 0.2445 | 0.7886 | nan | 0.9064 | 0.9203 | 0.0173 | 0.6894 | 0.2910 | nan | 0.0148 | 0.0 | 0.0 | 0.8957 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8979 | 0.0 | 0.1880 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9432 | 0.8700 | 0.9448 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6275 | 0.8157 | 0.0173 | 0.5806 | 0.1926 | nan | 0.0146 | 0.0 | 0.0 | 0.7156 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6143 | 0.0 | 0.1660 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7956 | 0.7322 | 0.8670 | 0.0 | 0.0 | 0.0001 | 0.0 |
| 0.9139 | 5.4 | 540 | 0.8471 | 0.2029 | 0.2511 | 0.7916 | nan | 0.8827 | 0.9204 | 0.0630 | 0.7090 | 0.3398 | nan | 0.0641 | 0.0 | 0.0 | 0.9035 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9139 | 0.0 | 0.2307 | 0.0001 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9325 | 0.8776 | 0.9454 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.6443 | 0.8214 | 0.0629 | 0.5436 | 0.2018 | nan | 0.0622 | 0.0 | 0.0 | 0.7210 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6143 | 0.0 | 0.1970 | 0.0001 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8101 | 0.7351 | 0.8745 | 0.0 | 0.0 | 0.0001 | 0.0 |
| 0.8528 | 5.6 | 560 | 0.8405 | 0.2037 | 0.2503 | 0.7923 | nan | 0.9042 | 0.9177 | 0.0283 | 0.6995 | 0.3188 | nan | 0.0753 | 0.0 | 0.0 | 0.9047 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9053 | 0.0 | 0.2489 | 0.0001 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9437 | 0.8607 | 0.9488 | 0.0 | 0.0 | 0.0025 | 0.0 | nan | 0.6417 | 0.8152 | 0.0283 | 0.5731 | 0.2017 | nan | 0.0730 | 0.0 | 0.0 | 0.7313 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6208 | 0.0 | 0.2107 | 0.0001 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8019 | 0.7364 | 0.8780 | 0.0 | 0.0 | 0.0025 | 0.0 |
| 0.872 | 5.8 | 580 | 0.8297 | 0.2054 | 0.2519 | 0.7960 | nan | 0.8919 | 0.9302 | 0.0523 | 0.6972 | 0.3636 | nan | 0.0677 | 0.0 | 0.0 | 0.9093 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9003 | 0.0 | 0.2434 | 0.0000 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9439 | 0.8505 | 0.9547 | 0.0 | 0.0 | 0.0034 | 0.0 | nan | 0.6557 | 0.8237 | 0.0523 | 0.5844 | 0.2215 | nan | 0.0653 | 0.0 | 0.0 | 0.7245 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6196 | 0.0 | 0.2076 | 0.0000 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7984 | 0.7397 | 0.8725 | 0.0 | 0.0 | 0.0034 | 0.0 |
| 0.8829 | 6.0 | 600 | 0.8210 | 0.2051 | 0.2515 | 0.7951 | nan | 0.8965 | 0.9251 | 0.0603 | 0.6944 | 0.3836 | nan | 0.0991 | 0.0004 | 0.0 | 0.9046 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9215 | 0.0 | 0.1654 | 0.0000 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9326 | 0.8709 | 0.9420 | 0.0 | 0.0 | 0.0006 | 0.0 | nan | 0.6568 | 0.8223 | 0.0603 | 0.5965 | 0.2284 | nan | 0.0939 | 0.0004 | 0.0 | 0.7225 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6042 | 0.0 | 0.1493 | 0.0000 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.8074 | 0.7397 | 0.8773 | 0.0 | 0.0 | 0.0006 | 0.0 |
| 0.8738 | 6.2 | 620 | 0.8141 | 0.2082 | 0.2557 | 0.7974 | nan | 0.8904 | 0.9229 | 0.0874 | 0.7120 | 0.3749 | nan | 0.1194 | 0.0000 | 0.0 | 0.9044 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.8990 | 0.0 | 0.2487 | 0.0010 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9501 | 0.8641 | 0.9514 | 0.0 | 0.0 | 0.0020 | 0.0 | nan | 0.6574 | 0.8248 | 0.0874 | 0.5800 | 0.2353 | nan | 0.1120 | 0.0000 | 0.0 | 0.7273 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.6245 | 0.0 | 0.2083 | 0.0010 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.7950 | 0.7323 | 0.8678 | 0.0 | 0.0 | 0.0020 | 0.0 |
| 0.6292 | 6.4 | 640 | 0.8079 | 0.2079 | 0.2512 | 0.7994 | nan | 0.8854 | 0.9526 | 0.1002 | 0.6565 | 0.3216 | nan | 0.0882 | 0.0009 | 0.0 | 0.9166 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9071 | 0.0 | 0.2263 | 0.0007 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9237 | 0.8518 | 0.9523 | 0.0 | 0.0 | 0.0023 | 0.0 | nan | 0.6650 | 0.8156 | 0.1002 | 0.6094 | 0.2105 | nan | 0.0847 | 0.0009 | 0.0 | 0.7214 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6220 | 0.0 | 0.1959 | 0.0007 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.8137 | 0.7339 | 0.8700 | 0.0 | 0.0 | 0.0022 | 0.0 |
| 0.7267 | 6.6 | 660 | 0.8046 | 0.2101 | 0.2569 | 0.7982 | nan | 0.8693 | 0.9297 | 0.0797 | 0.7006 | 0.4004 | nan | 0.1672 | 0.0006 | 0.0 | 0.9130 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9166 | 0.0 | 0.2455 | 0.0011 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.9442 | 0.8417 | 0.9490 | 0.0 | 0.0 | 0.0058 | 0.0 | nan | 0.6665 | 0.8273 | 0.0797 | 0.5924 | 0.2180 | nan | 0.1548 | 0.0006 | 0.0 | 0.7274 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6189 | 0.0 | 0.2043 | 0.0011 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.8091 | 0.7266 | 0.8815 | 0.0 | 0.0 | 0.0057 | 0.0 |
| 0.6611 | 6.8 | 680 | 0.7932 | 0.2130 | 0.2578 | 0.8019 | nan | 0.9229 | 0.9264 | 0.1598 | 0.6944 | 0.3422 | nan | 0.1357 | 0.0011 | 0.0 | 0.9068 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9202 | 0.0 | 0.2401 | 0.0025 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.9332 | 0.8495 | 0.9503 | 0.0 | 0.0 | 0.0058 | 0.0 | nan | 0.6562 | 0.8272 | 0.1597 | 0.6123 | 0.2261 | nan | 0.1247 | 0.0011 | 0.0 | 0.7324 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6204 | 0.0 | 0.1981 | 0.0025 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.8179 | 0.7415 | 0.8785 | 0.0 | 0.0 | 0.0056 | 0.0 |
| 0.6721 | 7.0 | 700 | 0.7952 | 0.2193 | 0.2698 | 0.8009 | nan | 0.8870 | 0.9010 | 0.3046 | 0.7463 | 0.4459 | nan | 0.2092 | 0.0014 | 0.0 | 0.9168 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9005 | 0.0 | 0.2701 | 0.0042 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9419 | 0.8750 | 0.9566 | 0.0 | 0.0 | 0.0020 | 0.0 | nan | 0.6773 | 0.8206 | 0.2988 | 0.5490 | 0.2396 | nan | 0.1894 | 0.0014 | 0.0 | 0.7234 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6377 | 0.0 | 0.2254 | 0.0042 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.8072 | 0.7455 | 0.8755 | 0.0 | 0.0 | 0.0019 | 0.0 |
| 0.5819 | 7.2 | 720 | 0.7728 | 0.2222 | 0.2668 | 0.8094 | nan | 0.8958 | 0.9345 | 0.2916 | 0.7034 | 0.3836 | nan | 0.1924 | 0.0009 | 0.0 | 0.9042 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9203 | 0.0 | 0.2880 | 0.0021 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.9389 | 0.8602 | 0.9481 | 0.0 | 0.0 | 0.0051 | 0.0 | nan | 0.6939 | 0.8298 | 0.2880 | 0.6142 | 0.2385 | nan | 0.1759 | 0.0009 | 0.0 | 0.7341 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6193 | 0.0 | 0.2356 | 0.0021 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.8168 | 0.7499 | 0.8834 | 0.0 | 0.0 | 0.0050 | 0.0 |
| 0.6658 | 7.4 | 740 | 0.7738 | 0.2198 | 0.2659 | 0.8058 | nan | 0.9091 | 0.9203 | 0.2681 | 0.7243 | 0.3777 | nan | 0.1815 | 0.0014 | 0.0 | 0.9099 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9004 | 0.0 | 0.2930 | 0.0040 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9537 | 0.8315 | 0.9542 | 0.0 | 0.0 | 0.0122 | 0.0 | nan | 0.6808 | 0.8262 | 0.2653 | 0.5933 | 0.2418 | nan | 0.1658 | 0.0014 | 0.0 | 0.7310 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6331 | 0.0 | 0.2418 | 0.0040 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.8021 | 0.7367 | 0.8791 | 0.0 | 0.0 | 0.0118 | 0.0 |
| 0.5962 | 7.6 | 760 | 0.7682 | 0.2215 | 0.2673 | 0.8095 | nan | 0.8782 | 0.9431 | 0.2120 | 0.7039 | 0.3838 | nan | 0.2345 | 0.0019 | 0.0 | 0.9254 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9131 | 0.0 | 0.3310 | 0.0037 | 0.0 | nan | 0.0 | 0.0004 | 0.0 | 0.0 | 0.9288 | 0.8741 | 0.9474 | 0.0 | 0.0 | 0.0046 | 0.0 | nan | 0.6927 | 0.8289 | 0.2111 | 0.6077 | 0.2347 | nan | 0.2072 | 0.0019 | 0.0 | 0.7222 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.6409 | 0.0 | 0.2561 | 0.0037 | 0.0 | nan | 0.0 | 0.0004 | 0.0 | 0.0 | 0.8205 | 0.7491 | 0.8859 | 0.0 | 0.0 | 0.0044 | 0.0 |
| 0.6425 | 7.8 | 780 | 0.7661 | 0.2261 | 0.2729 | 0.8108 | nan | 0.8701 | 0.9341 | 0.2964 | 0.7135 | 0.4178 | nan | 0.2577 | 0.0012 | 0.0 | 0.9095 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9081 | 0.0 | 0.3501 | 0.0142 | 0.0 | nan | 0.0 | 0.0005 | 0.0 | 0.0 | 0.9396 | 0.8818 | 0.9557 | 0.0 | 0.0 | 0.0089 | 0.0 | nan | 0.6978 | 0.8311 | 0.2915 | 0.6089 | 0.2366 | nan | 0.2257 | 0.0012 | 0.0 | 0.7413 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.6369 | 0.0 | 0.2721 | 0.0142 | 0.0 | nan | 0.0 | 0.0005 | 0.0 | 0.0 | 0.8155 | 0.7437 | 0.8821 | 0.0 | 0.0 | 0.0086 | 0.0 |
| 0.5529 | 8.0 | 800 | 0.7591 | 0.2238 | 0.2684 | 0.8106 | nan | 0.9069 | 0.9317 | 0.2696 | 0.7076 | 0.3851 | nan | 0.2549 | 0.0020 | 0.0 | 0.9071 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.9131 | 0.0 | 0.2889 | 0.0082 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.9457 | 0.8403 | 0.9520 | 0.0 | 0.0 | 0.0078 | 0.0 | nan | 0.6896 | 0.8323 | 0.2676 | 0.6214 | 0.2450 | nan | 0.2222 | 0.0020 | 0.0 | 0.7380 | 0.0 | 0.0 | nan | 0.0 | 0.0000 | 0.0 | 0.0 | 0.6297 | 0.0 | 0.2351 | 0.0082 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.8138 | 0.7415 | 0.8847 | 0.0 | 0.0 | 0.0075 | 0.0 |
| 0.537 | 8.2 | 820 | 0.7602 | 0.2260 | 0.2716 | 0.8120 | nan | 0.9080 | 0.9297 | 0.2903 | 0.7007 | 0.3919 | nan | 0.2584 | 0.0050 | 0.0 | 0.9201 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.9038 | 0.0 | 0.3262 | 0.0109 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.9444 | 0.8671 | 0.9506 | 0.0 | 0.0 | 0.0118 | 0.0 | nan | 0.6891 | 0.8322 | 0.2873 | 0.6210 | 0.2465 | nan | 0.2238 | 0.0050 | 0.0 | 0.7299 | 0.0 | 0.0 | nan | 0.0 | 0.0001 | 0.0 | 0.0 | 0.6458 | 0.0 | 0.2569 | 0.0109 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.8130 | 0.7473 | 0.8864 | 0.0 | 0.0 | 0.0111 | 0.0 |
| 0.6042 | 8.4 | 840 | 0.7511 | 0.2280 | 0.2738 | 0.8135 | nan | 0.8814 | 0.9369 | 0.3271 | 0.7198 | 0.3999 | nan | 0.2873 | 0.0065 | 0.0 | 0.9200 | 0.0 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.9173 | 0.0 | 0.3226 | 0.0097 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.9353 | 0.8613 | 0.9507 | 0.0 | 0.0 | 0.0088 | 0.0 | nan | 0.6985 | 0.8327 | 0.3229 | 0.6096 | 0.2492 | nan | 0.2476 | 0.0065 | 0.0 | 0.7324 | 0.0 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.6384 | 0.0 | 0.2517 | 0.0097 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.8224 | 0.7486 | 0.8887 | 0.0 | 0.0 | 0.0083 | 0.0 |
| 1.131 | 8.6 | 860 | 0.7509 | 0.2289 | 0.2759 | 0.8127 | nan | 0.8743 | 0.9280 | 0.3798 | 0.7464 | 0.3959 | nan | 0.2912 | 0.0068 | 0.0 | 0.9126 | 0.0 | 0.0 | nan | 0.0 | 0.0016 | 0.0 | 0.0 | 0.9100 | 0.0 | 0.3119 | 0.0151 | 0.0 | nan | 0.0 | 0.0008 | 0.0 | 0.0 | 0.9445 | 0.8648 | 0.9584 | 0.0 | 0.0 | 0.0098 | 0.0 | nan | 0.6955 | 0.8325 | 0.3711 | 0.5880 | 0.2550 | nan | 0.2498 | 0.0068 | 0.0 | 0.7389 | 0.0 | 0.0 | nan | 0.0 | 0.0016 | 0.0 | 0.0 | 0.6369 | 0.0 | 0.2504 | 0.0150 | 0.0 | nan | 0.0 | 0.0008 | 0.0 | 0.0 | 0.8134 | 0.7454 | 0.8842 | 0.0 | 0.0 | 0.0093 | 0.0 |
| 0.6354 | 8.8 | 880 | 0.7528 | 0.2283 | 0.2731 | 0.8135 | nan | 0.9045 | 0.9313 | 0.3668 | 0.7101 | 0.4039 | nan | 0.2417 | 0.0084 | 0.0 | 0.9167 | 0.0 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.9122 | 0.0 | 0.2941 | 0.0145 | 0.0 | nan | 0.0 | 0.0011 | 0.0 | 0.0 | 0.9413 | 0.8466 | 0.9591 | 0.0 | 0.0 | 0.0118 | 0.0 | nan | 0.6954 | 0.8324 | 0.3614 | 0.6268 | 0.2533 | nan | 0.2141 | 0.0084 | 0.0 | 0.7368 | 0.0 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.6342 | 0.0 | 0.2400 | 0.0145 | 0.0 | nan | 0.0 | 0.0011 | 0.0 | 0.0 | 0.8160 | 0.7475 | 0.8840 | 0.0 | 0.0 | 0.0111 | 0.0 |
| 0.5614 | 9.0 | 900 | 0.7530 | 0.2297 | 0.2759 | 0.8142 | nan | 0.8974 | 0.9303 | 0.3799 | 0.7167 | 0.4055 | nan | 0.2530 | 0.0095 | 0.0 | 0.9206 | 0.0 | 0.0 | nan | 0.0 | 0.0008 | 0.0 | 0.0 | 0.9010 | 0.0 | 0.3152 | 0.0283 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.9396 | 0.8818 | 0.9604 | 0.0 | 0.0 | 0.0111 | 0.0 | nan | 0.6982 | 0.8313 | 0.3738 | 0.6163 | 0.2556 | nan | 0.2234 | 0.0095 | 0.0 | 0.7375 | 0.0 | 0.0 | nan | 0.0 | 0.0008 | 0.0 | 0.0 | 0.6477 | 0.0 | 0.2519 | 0.0282 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.8117 | 0.7411 | 0.8825 | 0.0 | 0.0 | 0.0103 | 0.0 |
| 0.9059 | 9.2 | 920 | 0.7442 | 0.2279 | 0.2742 | 0.8137 | nan | 0.8918 | 0.9339 | 0.3089 | 0.7218 | 0.4054 | nan | 0.2675 | 0.0085 | 0.0 | 0.9221 | 0.0 | 0.0 | nan | 0.0 | 0.0012 | 0.0 | 0.0 | 0.9122 | 0.0 | 0.3199 | 0.0234 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.9337 | 0.8793 | 0.9587 | 0.0 | 0.0 | 0.0100 | 0.0 | nan | 0.6932 | 0.8332 | 0.3048 | 0.6196 | 0.2571 | nan | 0.2340 | 0.0085 | 0.0 | 0.7372 | 0.0 | 0.0 | nan | 0.0 | 0.0012 | 0.0 | 0.0 | 0.6429 | 0.0 | 0.2524 | 0.0234 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.8193 | 0.7437 | 0.8853 | 0.0 | 0.0 | 0.0093 | 0.0 |
| 0.6598 | 9.4 | 940 | 0.7460 | 0.2305 | 0.2760 | 0.8150 | nan | 0.8950 | 0.9325 | 0.3731 | 0.7204 | 0.4041 | nan | 0.2699 | 0.0084 | 0.0 | 0.9192 | 0.0 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.9055 | 0.0 | 0.3261 | 0.0279 | 0.0 | nan | 0.0 | 0.0006 | 0.0 | 0.0 | 0.9429 | 0.8605 | 0.9568 | 0.0 | 0.0 | 0.0138 | 0.0 | nan | 0.6981 | 0.8321 | 0.3670 | 0.6163 | 0.2577 | nan | 0.2353 | 0.0084 | 0.0 | 0.7410 | 0.0 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.6469 | 0.0 | 0.2563 | 0.0278 | 0.0 | nan | 0.0 | 0.0006 | 0.0 | 0.0 | 0.8153 | 0.7441 | 0.8853 | 0.0 | 0.0 | 0.0127 | 0.0 |
| 0.6652 | 9.6 | 960 | 0.7475 | 0.2318 | 0.2773 | 0.8162 | nan | 0.8905 | 0.9337 | 0.3995 | 0.7226 | 0.4010 | nan | 0.2910 | 0.0075 | 0.0 | 0.9129 | 0.0 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.9082 | 0.0 | 0.3270 | 0.0228 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.9445 | 0.8605 | 0.9580 | 0.0 | 0.0 | 0.0145 | 0.0 | nan | 0.7044 | 0.8347 | 0.3920 | 0.6129 | 0.2585 | nan | 0.2498 | 0.0075 | 0.0 | 0.7453 | 0.0 | 0.0 | nan | 0.0 | 0.0009 | 0.0 | 0.0 | 0.6432 | 0.0 | 0.2543 | 0.0227 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.8135 | 0.7463 | 0.8842 | 0.0 | 0.0 | 0.0134 | 0.0 |
| 0.7944 | 9.8 | 980 | 0.7423 | 0.2292 | 0.2755 | 0.8138 | nan | 0.8996 | 0.9281 | 0.3324 | 0.7189 | 0.4220 | nan | 0.2835 | 0.0071 | 0.0 | 0.9133 | 0.0 | 0.0 | nan | 0.0 | 0.0021 | 0.0 | 0.0 | 0.9172 | 0.0 | 0.3236 | 0.0180 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.9378 | 0.8662 | 0.9570 | 0.0 | 0.0 | 0.0115 | 0.0 | nan | 0.6936 | 0.8353 | 0.3281 | 0.6170 | 0.2578 | nan | 0.2444 | 0.0071 | 0.0 | 0.7437 | 0.0 | 0.0 | nan | 0.0 | 0.0021 | 0.0 | 0.0 | 0.6382 | 0.0 | 0.2544 | 0.0180 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.8204 | 0.7496 | 0.8849 | 0.0 | 0.0 | 0.0107 | 0.0 |
| 0.7053 | 10.0 | 1000 | 0.7416 | 0.2291 | 0.2745 | 0.8148 | nan | 0.8989 | 0.9334 | 0.3286 | 0.7157 | 0.4025 | nan | 0.2777 | 0.0070 | 0.0 | 0.9165 | 0.0 | 0.0 | nan | 0.0 | 0.0015 | 0.0 | 0.0 | 0.9119 | 0.0 | 0.3250 | 0.0193 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.9432 | 0.8601 | 0.9549 | 0.0 | 0.0 | 0.0118 | 0.0 | nan | 0.6950 | 0.8358 | 0.3244 | 0.6194 | 0.2588 | nan | 0.2398 | 0.0070 | 0.0 | 0.7419 | 0.0 | 0.0 | nan | 0.0 | 0.0015 | 0.0 | 0.0 | 0.6416 | 0.0 | 0.2560 | 0.0193 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.8168 | 0.7483 | 0.8861 | 0.0 | 0.0 | 0.0109 | 0.0 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"unlabeled",
"flat-road",
"flat-sidewalk",
"flat-crosswalk",
"flat-cyclinglane",
"flat-parkingdriveway",
"flat-railtrack",
"flat-curb",
"human-person",
"human-rider",
"vehicle-car",
"vehicle-truck",
"vehicle-bus",
"vehicle-tramtrain",
"vehicle-motorcycle",
"vehicle-bicycle",
"vehicle-caravan",
"vehicle-cartrailer",
"construction-building",
"construction-door",
"construction-wall",
"construction-fenceguardrail",
"construction-bridge",
"construction-tunnel",
"construction-stairs",
"object-pole",
"object-trafficsign",
"object-trafficlight",
"nature-vegetation",
"nature-terrain",
"sky",
"void-ground",
"void-dynamic",
"void-static",
"void-unclear"
] |
andro-flock/b2-semantic-segment |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# b2-semantic-segment
This model is a fine-tuned version of [andro-flock/b2-classification](https://huggingface.co/andro-flock/b2-classification) on the andro-flock/semantic-segment2 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 2000
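For reference, a minimal sketch of how this configuration maps onto the `transformers` `TrainingArguments` API; the `output_dir` is an assumption, everything else mirrors the list above:
```python
from transformers import TrainingArguments

# Sketch only: reproduces the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="b2-semantic-segment",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=2000,
)
```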
### Training results
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.4.0
- Tokenizers 0.21.0
| [
"background",
"skin",
"hair"
] |
yeray142/finetune-instance-segmentation-ade20k-mini-mask2former |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetune-instance-segmentation-ade20k-mini-mask2former
This model is a fine-tuned version of [facebook/mask2former-swin-tiny-coco-instance](https://huggingface.co/facebook/mask2former-swin-tiny-coco-instance) on the yeray142/kitti-mots-instance dataset.
It achieves the following results on the evaluation set:
- Loss: 21.4682
- Map: 0.2191
- Map 50: 0.4214
- Map 75: 0.2032
- Map Small: 0.1293
- Map Medium: 0.4299
- Map Large: 0.9458
- Mar 1: 0.0979
- Mar 10: 0.2731
- Mar 100: 0.3209
- Mar Small: 0.2542
- Mar Medium: 0.5212
- Mar Large: 0.9604
- Map Car: 0.406
- Mar 100 Car: 0.5312
- Map Person: 0.0323
- Mar 100 Person: 0.1106
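These are COCO-style mask AP/AR numbers. A self-contained sketch of how such metrics can be computed with `torchmetrics` (requires `pycocotools`; the masks, scores, and labels below are illustrative dummies, not data from this evaluation):
```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Dummy boolean instance masks for one image (label 0 = car, 1 = person).
pred_masks = torch.zeros(2, 64, 64, dtype=torch.bool)
pred_masks[0, :32, :32] = True
pred_masks[1, 40:, 40:] = True
preds = [{"masks": pred_masks,
          "scores": torch.tensor([0.9, 0.6]),
          "labels": torch.tensor([0, 1])}]
targets = [{"masks": pred_masks.clone(),  # perfect overlap, demo only
            "labels": torch.tensor([0, 1])}]

metric = MeanAveragePrecision(iou_type="segm", class_metrics=True)
metric.update(preds, targets)
print(metric.compute())  # map, map_50, map_75, mar_100, per-class values, ...
```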
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 10.0
- mixed_precision_training: Native AMP
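A minimal sketch of the setup these hyperparameters describe; the `output_dir` and the `fp16` flag standing in for "Native AMP" are assumptions, and the class head is re-initialised for the two KITTI-MOTS classes:
```python
from transformers import Mask2FormerForUniversalSegmentation, TrainingArguments

model = Mask2FormerForUniversalSegmentation.from_pretrained(
    "facebook/mask2former-swin-tiny-coco-instance",
    id2label={0: "car", 1: "person"},
    ignore_mismatched_sizes=True,  # new 2-class head replaces the COCO head
)
args = TrainingArguments(
    output_dir="finetune-instance-segmentation-ade20k-mini-mask2former",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="constant",
    num_train_epochs=10.0,
    fp16=True,  # "Native AMP" mixed precision (assumed flag)
)
```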
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Car | Mar 100 Car | Map Person | Mar 100 Person |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------:|:-----------:|:----------:|:--------------:|
| 32.2084 | 1.0 | 315 | 24.9517 | 0.1754 | 0.3241 | 0.1688 | 0.0901 | 0.3559 | 0.9036 | 0.0848 | 0.2249 | 0.2625 | 0.1937 | 0.4585 | 0.9383 | 0.3414 | 0.4833 | 0.0093 | 0.0417 |
| 26.4262 | 2.0 | 630 | 23.8438 | 0.1907 | 0.358 | 0.1793 | 0.1022 | 0.3764 | 0.9182 | 0.0904 | 0.2378 | 0.2851 | 0.2184 | 0.4779 | 0.9446 | 0.3657 | 0.5024 | 0.0157 | 0.0677 |
| 24.7264 | 3.0 | 945 | 22.7357 | 0.197 | 0.3715 | 0.189 | 0.1086 | 0.3803 | 0.9337 | 0.0912 | 0.2441 | 0.2901 | 0.2234 | 0.4819 | 0.9531 | 0.3769 | 0.5086 | 0.017 | 0.0716 |
| 23.7704 | 4.0 | 1260 | 22.5427 | 0.2001 | 0.3753 | 0.1878 | 0.1092 | 0.3902 | 0.9368 | 0.0924 | 0.2519 | 0.2994 | 0.2332 | 0.4914 | 0.9552 | 0.3791 | 0.513 | 0.0211 | 0.0858 |
| 22.7954 | 5.0 | 1575 | 22.0928 | 0.2071 | 0.3926 | 0.195 | 0.1184 | 0.4043 | 0.933 | 0.0961 | 0.2594 | 0.3075 | 0.2418 | 0.5028 | 0.9524 | 0.3906 | 0.5253 | 0.0237 | 0.0897 |
| 22.2719 | 6.0 | 1890 | 21.8539 | 0.2135 | 0.4034 | 0.1965 | 0.1216 | 0.4159 | 0.9446 | 0.0973 | 0.265 | 0.3128 | 0.2478 | 0.5031 | 0.9608 | 0.3985 | 0.5309 | 0.0285 | 0.0946 |
| 21.6338 | 7.0 | 2205 | 21.7856 | 0.2125 | 0.4048 | 0.1965 | 0.1201 | 0.4207 | 0.9388 | 0.0967 | 0.2641 | 0.3131 | 0.2466 | 0.5119 | 0.957 | 0.3956 | 0.524 | 0.0293 | 0.1023 |
| 21.3044 | 8.0 | 2520 | 21.4704 | 0.2152 | 0.4046 | 0.2003 | 0.1229 | 0.4233 | 0.9421 | 0.0983 | 0.2663 | 0.3149 | 0.2487 | 0.5109 | 0.9592 | 0.4002 | 0.5274 | 0.0301 | 0.1024 |
| 20.9003 | 9.0 | 2835 | 21.5561 | 0.2151 | 0.4079 | 0.1994 | 0.124 | 0.4264 | 0.946 | 0.0977 | 0.2678 | 0.3194 | 0.2535 | 0.5132 | 0.9598 | 0.3997 | 0.5286 | 0.0304 | 0.1102 |
| 20.5867 | 9.9698 | 3140 | 21.4682 | 0.2191 | 0.4214 | 0.2032 | 0.1293 | 0.4299 | 0.9458 | 0.0979 | 0.2731 | 0.3209 | 0.2542 | 0.5212 | 0.9604 | 0.406 | 0.5312 | 0.0323 | 0.1106 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"car",
"person"
] |
yeray142/finetune-instance-segmentation-ade20k-mini-mask2former_augmentation |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetune-instance-segmentation-ade20k-mini-mask2former_augmentation
This model is a fine-tuned version of [facebook/mask2former-swin-tiny-coco-instance](https://huggingface.co/facebook/mask2former-swin-tiny-coco-instance) on the yeray142/kitti-mots-instance dataset.
It achieves the following results on the evaluation set:
- Loss: 21.8491
- Map: 0.2024
- Map 50: 0.3976
- Map 75: 0.1846
- Map Small: 0.1131
- Map Medium: 0.4171
- Map Large: 0.9371
- Mar 1: 0.098
- Mar 10: 0.2621
- Mar 100: 0.3113
- Mar Small: 0.2456
- Mar Medium: 0.5068
- Mar Large: 0.9545
- Map Car: 0.3761
- Mar 100 Car: 0.5206
- Map Person: 0.0288
- Mar 100 Person: 0.102
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Car | Mar 100 Car | Map Person | Mar 100 Person |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------:|:-----------:|:----------:|:--------------:|
| 32.1236 | 1.0 | 315 | 25.0016 | 0.172 | 0.3235 | 0.1677 | 0.0881 | 0.3483 | 0.8944 | 0.0877 | 0.2218 | 0.2617 | 0.1935 | 0.4549 | 0.9347 | 0.3376 | 0.4775 | 0.0065 | 0.0459 |
| 25.8006 | 2.0 | 630 | 23.8844 | 0.1836 | 0.3505 | 0.1708 | 0.0973 | 0.3691 | 0.9132 | 0.09 | 0.2324 | 0.276 | 0.2074 | 0.4765 | 0.9434 | 0.3541 | 0.4901 | 0.0131 | 0.0619 |
| 24.098 | 3.0 | 945 | 23.1822 | 0.1892 | 0.3583 | 0.1751 | 0.0975 | 0.3823 | 0.9297 | 0.0904 | 0.2392 | 0.283 | 0.215 | 0.4814 | 0.9495 | 0.3616 | 0.4967 | 0.0168 | 0.0693 |
| 23.0237 | 4.0 | 1260 | 22.7127 | 0.1913 | 0.3692 | 0.1751 | 0.1017 | 0.3846 | 0.9289 | 0.0933 | 0.2437 | 0.289 | 0.2225 | 0.4827 | 0.9486 | 0.3635 | 0.5003 | 0.0191 | 0.0778 |
| 22.25 | 5.0 | 1575 | 22.5918 | 0.1933 | 0.3765 | 0.1754 | 0.1053 | 0.3951 | 0.9267 | 0.0934 | 0.2477 | 0.2916 | 0.2253 | 0.4829 | 0.9474 | 0.3648 | 0.5 | 0.0218 | 0.0832 |
| 21.7056 | 6.0 | 1890 | 21.9666 | 0.2019 | 0.3913 | 0.1833 | 0.1101 | 0.4037 | 0.9311 | 0.0965 | 0.256 | 0.2998 | 0.235 | 0.4911 | 0.9497 | 0.3775 | 0.5145 | 0.0263 | 0.0852 |
| 21.218 | 7.0 | 2205 | 22.1376 | 0.2002 | 0.3859 | 0.1841 | 0.1087 | 0.412 | 0.9299 | 0.0974 | 0.255 | 0.3003 | 0.2331 | 0.5004 | 0.9524 | 0.3751 | 0.5113 | 0.0254 | 0.0892 |
| 20.7151 | 8.0 | 2520 | 21.7431 | 0.2013 | 0.3953 | 0.1819 | 0.1105 | 0.411 | 0.9349 | 0.0973 | 0.2595 | 0.3059 | 0.2401 | 0.5016 | 0.9533 | 0.375 | 0.5178 | 0.0277 | 0.094 |
| 20.4197 | 9.0 | 2835 | 21.8546 | 0.2024 | 0.3925 | 0.184 | 0.1112 | 0.4136 | 0.9325 | 0.0971 | 0.2589 | 0.3044 | 0.2387 | 0.4965 | 0.9531 | 0.3781 | 0.5164 | 0.0267 | 0.0925 |
| 20.1339 | 9.9698 | 3140 | 21.8491 | 0.2024 | 0.3976 | 0.1846 | 0.1131 | 0.4171 | 0.9371 | 0.098 | 0.2621 | 0.3113 | 0.2456 | 0.5068 | 0.9545 | 0.3761 | 0.5206 | 0.0288 | 0.102 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"car",
"person"
] |
yeray142/finetune-instance-segmentation-mini-mask2former_augmentation_default |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetune-instance-segmentation-mini-mask2former_augmentation_default
This model is a fine-tuned version of [facebook/mask2former-swin-tiny-coco-instance](https://huggingface.co/facebook/mask2former-swin-tiny-coco-instance) on the jsalavedra/strawberry_disease dataset.
It achieves the following results on the evaluation set:
- Loss: 16.2566
- Map: 0.6229
- Map 50: 0.8172
- Map 75: 0.6824
- Map Small: 0.35
- Map Medium: 0.401
- Map Large: 0.7146
- Mar 1: 0.4279
- Mar 10: 0.7886
- Mar 100: 0.831
- Mar Small: 0.5
- Mar Medium: 0.7108
- Mar Large: 0.8802
- Map Angular leafspot: 0.54
- Mar 100 Angular leafspot: 0.8135
- Map Anthracnose fruit rot: 0.405
- Mar 100 Anthracnose fruit rot: 0.7118
- Map Blossom blight: 0.7372
- Mar 100 Blossom blight: 0.8159
- Map Gray mold: 0.5767
- Mar 100 Gray mold: 0.7648
- Map Leaf spot: 0.8783
- Mar 100 Leaf spot: 0.9416
- Map Powdery mildew fruit: 0.5019
- Mar 100 Powdery mildew fruit: 0.8833
- Map Powdery mildew leaf: 0.7209
- Mar 100 Powdery mildew leaf: 0.8863
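For context, a hypothetical inference sketch for this checkpoint; the repo id is taken from the card title, and it is assumed the image processor was saved alongside the model:
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

repo = "yeray142/finetune-instance-segmentation-mini-mask2former_augmentation_default"
processor = AutoImageProcessor.from_pretrained(repo)
model = Mask2FormerForUniversalSegmentation.from_pretrained(repo)

image = Image.open("strawberry_leaf.jpg")  # your own image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# target_sizes expects (height, width); PIL's image.size is (width, height)
result = processor.post_process_instance_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
print(result["segments_info"])  # predicted instances: label ids and scores
```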
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Angular leafspot | Mar 100 Angular leafspot | Map Anthracnose fruit rot | Mar 100 Anthracnose fruit rot | Map Blossom blight | Mar 100 Blossom blight | Map Gray mold | Mar 100 Gray mold | Map Leaf spot | Mar 100 Leaf spot | Map Powdery mildew fruit | Mar 100 Powdery mildew fruit | Map Powdery mildew leaf | Mar 100 Powdery mildew leaf |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:-------------------------:|:-----------------------------:|:------------------:|:----------------------:|:-------------:|:-----------------:|:-------------:|:-----------------:|:------------------------:|:----------------------------:|:-----------------------:|:---------------------------:|
| 48.1383 | 1.0 | 91 | 32.7441 | 0.072 | 0.0944 | 0.0789 | 0.0008 | 0.028 | 0.1218 | 0.1444 | 0.2815 | 0.3583 | 0.2 | 0.1977 | 0.4277 | 0.0097 | 0.4962 | 0.001 | 0.0529 | 0.0025 | 0.0727 | 0.0527 | 0.2222 | 0.1818 | 0.8401 | 0.0002 | 0.05 | 0.2557 | 0.7737 |
| 28.1321 | 2.0 | 182 | 26.8257 | 0.2203 | 0.2891 | 0.2395 | 0.1267 | 0.1186 | 0.2617 | 0.2987 | 0.5351 | 0.5913 | 0.4 | 0.3726 | 0.6649 | 0.0203 | 0.5981 | 0.0049 | 0.1706 | 0.1769 | 0.65 | 0.1961 | 0.6324 | 0.6355 | 0.9082 | 0.0087 | 0.3444 | 0.4998 | 0.8353 |
| 23.6192 | 3.0 | 273 | 23.0934 | 0.3141 | 0.408 | 0.3423 | 0.2133 | 0.195 | 0.3834 | 0.3621 | 0.6516 | 0.6971 | 0.4 | 0.4291 | 0.7797 | 0.0457 | 0.7288 | 0.0175 | 0.4059 | 0.4873 | 0.7205 | 0.2984 | 0.7102 | 0.7575 | 0.9249 | 0.0257 | 0.5444 | 0.5666 | 0.8451 |
| 20.6054 | 4.0 | 364 | 21.1837 | 0.398 | 0.5303 | 0.434 | 0.2115 | 0.237 | 0.4998 | 0.3852 | 0.7062 | 0.7568 | 0.5 | 0.5624 | 0.8306 | 0.264 | 0.7365 | 0.0368 | 0.5882 | 0.5386 | 0.75 | 0.434 | 0.7287 | 0.7965 | 0.9331 | 0.107 | 0.7056 | 0.6088 | 0.8557 |
| 19.4288 | 5.0 | 455 | 19.9111 | 0.4366 | 0.5924 | 0.4686 | 0.1758 | 0.256 | 0.5269 | 0.4005 | 0.7295 | 0.7887 | 0.4 | 0.5917 | 0.8558 | 0.3157 | 0.7788 | 0.0679 | 0.6471 | 0.5715 | 0.7636 | 0.4758 | 0.7389 | 0.8107 | 0.9315 | 0.1825 | 0.8 | 0.632 | 0.8608 |
| 17.8867 | 6.0 | 546 | 18.9302 | 0.5064 | 0.6834 | 0.5457 | 0.35 | 0.3232 | 0.5693 | 0.395 | 0.7462 | 0.8006 | 0.45 | 0.6427 | 0.8583 | 0.3472 | 0.7731 | 0.1802 | 0.7 | 0.6516 | 0.7909 | 0.5193 | 0.7398 | 0.8209 | 0.9362 | 0.3584 | 0.8 | 0.6671 | 0.8639 |
| 16.9985 | 7.0 | 637 | 18.2692 | 0.5458 | 0.7266 | 0.6036 | 0.2667 | 0.353 | 0.6138 | 0.4145 | 0.7701 | 0.8142 | 0.4 | 0.6488 | 0.876 | 0.4325 | 0.775 | 0.2601 | 0.7118 | 0.7072 | 0.8159 | 0.5295 | 0.7472 | 0.8566 | 0.9424 | 0.3656 | 0.8333 | 0.6692 | 0.8737 |
| 16.1493 | 8.0 | 728 | 17.3118 | 0.5707 | 0.7557 | 0.6298 | 0.4036 | 0.3684 | 0.6464 | 0.4175 | 0.7831 | 0.8275 | 0.45 | 0.6815 | 0.8822 | 0.4954 | 0.7923 | 0.3283 | 0.7588 | 0.7096 | 0.8159 | 0.5474 | 0.7528 | 0.8717 | 0.9436 | 0.3381 | 0.8444 | 0.7045 | 0.8847 |
| 15.3879 | 9.0 | 819 | 16.7216 | 0.5907 | 0.7634 | 0.6611 | 0.35 | 0.3824 | 0.6636 | 0.4145 | 0.7808 | 0.8258 | 0.45 | 0.6663 | 0.8821 | 0.5305 | 0.8019 | 0.3748 | 0.7529 | 0.7337 | 0.8136 | 0.5658 | 0.7574 | 0.8738 | 0.9377 | 0.3421 | 0.8333 | 0.7144 | 0.8835 |
| 14.4614 | 10.0 | 910 | 16.2566 | 0.6229 | 0.8172 | 0.6824 | 0.35 | 0.401 | 0.7146 | 0.4279 | 0.7886 | 0.831 | 0.5 | 0.7108 | 0.8802 | 0.54 | 0.8135 | 0.405 | 0.7118 | 0.7372 | 0.8159 | 0.5767 | 0.7648 | 0.8783 | 0.9416 | 0.5019 | 0.8833 | 0.7209 | 0.8863 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"angular leafspot",
"anthracnose fruit rot",
"blossom blight",
"gray mold",
"leaf spot",
"powdery mildew fruit",
"powdery mildew leaf"
] |
yeray142/finetune-instance-segmentation-ade20k-mini-mask2former_backbone_frozen_1 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetune-instance-segmentation-ade20k-mini-mask2former_backbone_frozen_1
This model is a fine-tuned version of [facebook/mask2former-swin-tiny-coco-instance](https://huggingface.co/facebook/mask2former-swin-tiny-coco-instance) on the yeray142/kitti-mots-instance dataset.
It achieves the following results on the evaluation set:
- Loss: 22.5821
- Map: 0.181
- Map 50: 0.35
- Map 75: 0.164
- Map Small: 0.0954
- Map Medium: 0.3758
- Map Large: 0.9135
- Mar 1: 0.0856
- Mar 10: 0.2359
- Mar 100: 0.2819
- Mar Small: 0.2158
- Mar Medium: 0.4688
- Mar Large: 0.9371
- Map Car: 0.3473
- Mar 100 Car: 0.4911
- Map Person: 0.0147
- Mar 100 Person: 0.0727
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 10.0
- mixed_precision_training: Native AMP
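The "_backbone_frozen" variant suggests only the decoder heads were trained. A minimal sketch of how a frozen Swin backbone is typically expressed in `transformers` (the attribute path follows the current Mask2Former implementation and is an assumption about the training script, not confirmed by the card):
```python
from transformers import Mask2FormerForUniversalSegmentation

model = Mask2FormerForUniversalSegmentation.from_pretrained(
    "facebook/mask2former-swin-tiny-coco-instance",
    id2label={0: "car", 1: "person"},
    ignore_mismatched_sizes=True,
)
# Freeze the Swin encoder; pixel decoder and transformer decoder stay trainable.
for param in model.model.pixel_level_module.encoder.parameters():
    param.requires_grad = False
```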
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Car | Mar 100 Car | Map Person | Mar 100 Person |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------:|:-----------:|:----------:|:--------------:|
| 36.0195 | 1.0 | 315 | 27.1721 | 0.1432 | 0.279 | 0.1349 | 0.0688 | 0.3117 | 0.8516 | 0.0714 | 0.1887 | 0.2291 | 0.1605 | 0.4115 | 0.9133 | 0.2849 | 0.4372 | 0.0014 | 0.0209 |
| 28.6156 | 2.0 | 630 | 25.2879 | 0.1597 | 0.2996 | 0.1521 | 0.0779 | 0.337 | 0.8789 | 0.078 | 0.2055 | 0.2497 | 0.1825 | 0.4386 | 0.9173 | 0.3147 | 0.4628 | 0.0046 | 0.0367 |
| 26.5961 | 3.0 | 945 | 24.4641 | 0.1679 | 0.3171 | 0.1572 | 0.0817 | 0.3481 | 0.8921 | 0.0809 | 0.2137 | 0.259 | 0.193 | 0.4441 | 0.9229 | 0.326 | 0.473 | 0.0098 | 0.045 |
| 25.435 | 4.0 | 1260 | 24.1017 | 0.1701 | 0.3217 | 0.1577 | 0.0815 | 0.3562 | 0.9002 | 0.0816 | 0.2169 | 0.2601 | 0.1929 | 0.4479 | 0.9272 | 0.329 | 0.469 | 0.0112 | 0.0512 |
| 24.7221 | 5.0 | 1575 | 23.5551 | 0.1738 | 0.3286 | 0.162 | 0.0855 | 0.3615 | 0.8986 | 0.0833 | 0.2226 | 0.2717 | 0.2053 | 0.462 | 0.9276 | 0.3357 | 0.4836 | 0.0119 | 0.0597 |
| 24.1639 | 6.0 | 1890 | 23.3457 | 0.1761 | 0.3319 | 0.1641 | 0.0891 | 0.3606 | 0.9016 | 0.0838 | 0.2249 | 0.267 | 0.2003 | 0.4539 | 0.9265 | 0.3412 | 0.4749 | 0.0111 | 0.059 |
| 23.581 | 7.0 | 2205 | 23.0218 | 0.1801 | 0.3415 | 0.1687 | 0.0924 | 0.3682 | 0.9111 | 0.0854 | 0.2295 | 0.2735 | 0.2061 | 0.4619 | 0.9356 | 0.3475 | 0.4829 | 0.0126 | 0.064 |
| 23.1336 | 8.0 | 2520 | 22.8133 | 0.1817 | 0.3458 | 0.1673 | 0.0959 | 0.3723 | 0.9122 | 0.0852 | 0.234 | 0.2782 | 0.2108 | 0.4682 | 0.9352 | 0.3507 | 0.4868 | 0.0127 | 0.0697 |
| 22.6498 | 9.0 | 2835 | 22.8115 | 0.1823 | 0.3478 | 0.1683 | 0.0948 | 0.3758 | 0.9137 | 0.0856 | 0.2347 | 0.2818 | 0.2148 | 0.4691 | 0.9358 | 0.3508 | 0.4884 | 0.0139 | 0.0751 |
| 22.3868 | 9.9698 | 3140 | 22.5821 | 0.181 | 0.35 | 0.164 | 0.0954 | 0.3758 | 0.9135 | 0.0856 | 0.2359 | 0.2819 | 0.2158 | 0.4688 | 0.9371 | 0.3473 | 0.4911 | 0.0147 | 0.0727 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"car",
"person"
] |
yeray142/finetune-instance-segmentation-mini-mask2former_augmentation_default_backboneFrozen |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetune-instance-segmentation-mini-mask2former_augmentation_default_backboneFrozen
This model is a fine-tuned version of [facebook/mask2former-swin-tiny-coco-instance](https://huggingface.co/facebook/mask2former-swin-tiny-coco-instance) on the jsalavedra/strawberry_disease dataset.
It achieves the following results on the evaluation set:
- Loss: 19.5597
- Map: 0.4279
- Map 50: 0.592
- Map 75: 0.476
- Map Small: 0.3002
- Map Medium: 0.2838
- Map Large: 0.4955
- Mar 1: 0.3752
- Mar 10: 0.7188
- Mar 100: 0.7682
- Mar Small: 0.45
- Mar Medium: 0.5977
- Mar Large: 0.8285
- Map Angular leafspot: 0.162
- Mar 100 Angular leafspot: 0.7154
- Map Anthracnose fruit rot: 0.1562
- Mar 100 Anthracnose fruit rot: 0.6118
- Map Blossom blight: 0.6367
- Mar 100 Blossom blight: 0.7455
- Map Gray mold: 0.4846
- Mar 100 Gray mold: 0.7259
- Map Leaf spot: 0.7665
- Mar 100 Leaf spot: 0.9253
- Map Powdery mildew fruit: 0.1744
- Mar 100 Powdery mildew fruit: 0.8056
- Map Powdery mildew leaf: 0.6152
- Mar 100 Powdery mildew leaf: 0.8478
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Angular leafspot | Mar 100 Angular leafspot | Map Anthracnose fruit rot | Mar 100 Anthracnose fruit rot | Map Blossom blight | Mar 100 Blossom blight | Map Gray mold | Mar 100 Gray mold | Map Leaf spot | Mar 100 Leaf spot | Map Powdery mildew fruit | Mar 100 Powdery mildew fruit | Map Powdery mildew leaf | Mar 100 Powdery mildew leaf |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:-------------------------:|:-----------------------------:|:------------------:|:----------------------:|:-------------:|:-----------------:|:-------------:|:-----------------:|:------------------------:|:----------------------------:|:-----------------------:|:---------------------------:|
| 53.1072 | 1.0 | 91 | 35.8147 | 0.0329 | 0.0466 | 0.0372 | 0.0 | 0.0194 | 0.052 | 0.0866 | 0.2227 | 0.2877 | 0.0 | 0.1691 | 0.3315 | 0.0011 | 0.0577 | 0.0005 | 0.0353 | 0.0005 | 0.0455 | 0.1032 | 0.3204 | 0.0595 | 0.7537 | 0.0003 | 0.0611 | 0.0653 | 0.74 |
| 31.9378 | 2.0 | 182 | 31.1508 | 0.146 | 0.1929 | 0.1592 | 0.0002 | 0.0603 | 0.1944 | 0.1728 | 0.3353 | 0.3875 | 0.05 | 0.2558 | 0.456 | 0.0361 | 0.1942 | 0.0012 | 0.0353 | 0.0142 | 0.1909 | 0.2208 | 0.5352 | 0.4042 | 0.8455 | 0.0017 | 0.1222 | 0.3436 | 0.7894 |
| 28.4456 | 3.0 | 273 | 27.9586 | 0.2437 | 0.3525 | 0.2638 | 0.0043 | 0.1252 | 0.2879 | 0.2934 | 0.5682 | 0.6154 | 0.15 | 0.4384 | 0.695 | 0.0633 | 0.55 | 0.0105 | 0.1824 | 0.297 | 0.5682 | 0.3008 | 0.6528 | 0.5844 | 0.8805 | 0.0137 | 0.6778 | 0.4367 | 0.7961 |
| 25.1946 | 4.0 | 364 | 25.4498 | 0.2996 | 0.4292 | 0.3245 | 0.0868 | 0.1635 | 0.3718 | 0.315 | 0.6344 | 0.6812 | 0.45 | 0.5052 | 0.7494 | 0.0876 | 0.5827 | 0.0309 | 0.4 | 0.4503 | 0.6591 | 0.3582 | 0.6889 | 0.6635 | 0.8946 | 0.0214 | 0.7333 | 0.4852 | 0.8098 |
| 23.2545 | 5.0 | 455 | 23.8964 | 0.3353 | 0.4803 | 0.3614 | 0.1095 | 0.19 | 0.4087 | 0.3296 | 0.672 | 0.7218 | 0.4 | 0.5347 | 0.7931 | 0.1005 | 0.6288 | 0.0936 | 0.5235 | 0.5137 | 0.6909 | 0.4064 | 0.6972 | 0.675 | 0.9019 | 0.038 | 0.7833 | 0.5196 | 0.8267 |
| 21.9945 | 6.0 | 546 | 22.5007 | 0.366 | 0.5185 | 0.3862 | 0.1667 | 0.2226 | 0.4397 | 0.3526 | 0.6857 | 0.729 | 0.4 | 0.5301 | 0.8134 | 0.1146 | 0.6788 | 0.1253 | 0.5294 | 0.578 | 0.7114 | 0.4334 | 0.7 | 0.701 | 0.9082 | 0.0576 | 0.75 | 0.5525 | 0.8255 |
| 20.5361 | 7.0 | 637 | 21.4663 | 0.383 | 0.5312 | 0.4124 | 0.3 | 0.2354 | 0.4501 | 0.3424 | 0.6987 | 0.7456 | 0.4 | 0.5526 | 0.8183 | 0.1369 | 0.7 | 0.1323 | 0.5941 | 0.5833 | 0.7227 | 0.449 | 0.7111 | 0.7275 | 0.9097 | 0.0821 | 0.75 | 0.5701 | 0.8318 |
| 19.5394 | 8.0 | 728 | 20.5817 | 0.4007 | 0.5596 | 0.4369 | 0.3 | 0.2559 | 0.4733 | 0.3649 | 0.7059 | 0.7526 | 0.4 | 0.5733 | 0.8147 | 0.1224 | 0.6923 | 0.161 | 0.6059 | 0.5924 | 0.7205 | 0.4677 | 0.7204 | 0.7382 | 0.9179 | 0.1318 | 0.7722 | 0.5918 | 0.8388 |
| 18.9893 | 9.0 | 819 | 20.0124 | 0.4102 | 0.5697 | 0.4529 | 0.3 | 0.266 | 0.478 | 0.3705 | 0.7166 | 0.7613 | 0.4 | 0.5824 | 0.8326 | 0.1426 | 0.7 | 0.1322 | 0.6412 | 0.6233 | 0.7409 | 0.4696 | 0.7194 | 0.7467 | 0.9195 | 0.1517 | 0.7667 | 0.6049 | 0.8416 |
| 18.3583 | 10.0 | 910 | 19.5597 | 0.4279 | 0.592 | 0.476 | 0.3002 | 0.2838 | 0.4955 | 0.3752 | 0.7188 | 0.7682 | 0.45 | 0.5977 | 0.8285 | 0.162 | 0.7154 | 0.1562 | 0.6118 | 0.6367 | 0.7455 | 0.4846 | 0.7259 | 0.7665 | 0.9253 | 0.1744 | 0.8056 | 0.6152 | 0.8478 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"angular leafspot",
"anthracnose fruit rot",
"blossom blight",
"gray mold",
"leaf spot",
"powdery mildew fruit",
"powdery mildew leaf"
] |
mujerry/mit-b0_necrosis |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mit-b0_necrosis
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0528
- Mean Iou: 0.8813
- Mean Accuracy: 0.9318
- Overall Accuracy: 0.9815
- Accuracy Background: 0.9940
- Accuracy Necrosis: 0.8416
- Accuracy Root: 0.9598
- Iou Background: 0.9877
- Iou Necrosis: 0.7339
- Iou Root: 0.9224
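The reported Mean IoU and accuracies follow the usual confusion-matrix definitions; a small sketch with an illustrative, made-up 3-class confusion matrix:
```python
import numpy as np

# Illustrative confusion matrix (rows = ground truth, cols = prediction)
# for background / necrosis / root; the numbers are made up.
cm = np.array([[980,  5,  15],
               [ 10, 70,  20],
               [ 15, 10, 275]])

tp = np.diag(cm)
iou = tp / (cm.sum(axis=0) + cm.sum(axis=1) - tp)  # per-class IoU
acc = tp / cm.sum(axis=1)                          # per-class accuracy
print("Mean IoU:", iou.mean())
print("Mean accuracy:", acc.mean())
print("Overall accuracy:", tp.sum() / cm.sum())
```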
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
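A sketch of the optimizer/scheduler pairing above; the total of 2640 steps is read off the training-results table (120 epochs × 22 steps per epoch), and the model below is a placeholder:
```python
import torch
from torch.optim import AdamW
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(4, 3)  # placeholder for the SegFormer model
optimizer = AdamW(model.parameters(), lr=6e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.05 * 2640),  # lr_scheduler_warmup_ratio = 0.05
    num_training_steps=2640,
)
```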
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Necrosis | Accuracy Root | Iou Background | Iou Necrosis | Iou Root |
|:-------------:|:--------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------:|:-------------:|:--------------:|:------------:|:--------:|
| 1.0774 | 0.9091 | 20 | 1.0862 | 0.2257 | 0.4936 | 0.3667 | 0.2174 | 0.3590 | 0.9045 | 0.2172 | 0.0290 | 0.4308 |
| 0.79 | 1.8182 | 40 | 0.8729 | 0.5114 | 0.6752 | 0.8224 | 0.8068 | 0.2428 | 0.9761 | 0.8053 | 0.0818 | 0.6471 |
| 0.5765 | 2.7273 | 60 | 0.5984 | 0.6150 | 0.7260 | 0.9117 | 0.9226 | 0.2759 | 0.9794 | 0.9197 | 0.1919 | 0.7335 |
| 0.4356 | 3.6364 | 80 | 0.4563 | 0.7147 | 0.8218 | 0.9384 | 0.9483 | 0.5489 | 0.9683 | 0.9448 | 0.4062 | 0.7931 |
| 0.4001 | 4.5455 | 100 | 0.3937 | 0.7578 | 0.8837 | 0.9442 | 0.9511 | 0.7475 | 0.9525 | 0.9478 | 0.5178 | 0.8078 |
| 0.372 | 5.4545 | 120 | 0.3180 | 0.7836 | 0.8850 | 0.9531 | 0.9617 | 0.7345 | 0.9587 | 0.9578 | 0.5651 | 0.8279 |
| 0.2767 | 6.3636 | 140 | 0.2898 | 0.7942 | 0.9005 | 0.9541 | 0.9612 | 0.7829 | 0.9576 | 0.9574 | 0.5929 | 0.8322 |
| 0.26 | 7.2727 | 160 | 0.2609 | 0.7970 | 0.8839 | 0.9542 | 0.9600 | 0.7185 | 0.9731 | 0.9565 | 0.6054 | 0.8290 |
| 0.2399 | 8.1818 | 180 | 0.2031 | 0.8365 | 0.9099 | 0.9683 | 0.9782 | 0.7887 | 0.9628 | 0.9730 | 0.6605 | 0.8761 |
| 0.1764 | 9.0909 | 200 | 0.1796 | 0.8441 | 0.9299 | 0.9724 | 0.9886 | 0.8701 | 0.9311 | 0.9801 | 0.6641 | 0.8882 |
| 0.1436 | 10.0 | 220 | 0.1594 | 0.8520 | 0.9046 | 0.9752 | 0.9886 | 0.7623 | 0.9630 | 0.9813 | 0.6742 | 0.9004 |
| 0.1332 | 10.9091 | 240 | 0.1391 | 0.8572 | 0.9300 | 0.9755 | 0.9920 | 0.8626 | 0.9354 | 0.9825 | 0.6892 | 0.8998 |
| 0.1218 | 11.8182 | 260 | 0.1351 | 0.8519 | 0.9349 | 0.9744 | 0.9903 | 0.8812 | 0.9333 | 0.9822 | 0.6776 | 0.8959 |
| 0.1128 | 12.7273 | 280 | 0.1128 | 0.8644 | 0.9170 | 0.9774 | 0.9930 | 0.8082 | 0.9499 | 0.9833 | 0.7022 | 0.9077 |
| 0.0997 | 13.6364 | 300 | 0.1102 | 0.8603 | 0.9145 | 0.9763 | 0.9881 | 0.7900 | 0.9654 | 0.9822 | 0.6948 | 0.9038 |
| 0.0914 | 14.5455 | 320 | 0.1025 | 0.8661 | 0.9256 | 0.9776 | 0.9929 | 0.8378 | 0.9462 | 0.9838 | 0.7067 | 0.9078 |
| 0.0958 | 15.4545 | 340 | 0.0968 | 0.8653 | 0.9096 | 0.9779 | 0.9931 | 0.7792 | 0.9565 | 0.9835 | 0.7030 | 0.9095 |
| 0.0864 | 16.3636 | 360 | 0.0903 | 0.8672 | 0.9139 | 0.9784 | 0.9909 | 0.7847 | 0.9660 | 0.9844 | 0.7064 | 0.9109 |
| 0.0817 | 17.2727 | 380 | 0.0900 | 0.8543 | 0.8939 | 0.9774 | 0.9924 | 0.7229 | 0.9664 | 0.9843 | 0.6714 | 0.9071 |
| 0.0737 | 18.1818 | 400 | 0.0841 | 0.8661 | 0.9204 | 0.9775 | 0.9908 | 0.8130 | 0.9574 | 0.9834 | 0.7071 | 0.9078 |
| 0.0828 | 19.0909 | 420 | 0.0858 | 0.8697 | 0.9343 | 0.9786 | 0.9916 | 0.8593 | 0.9521 | 0.9853 | 0.7117 | 0.9120 |
| 0.0907 | 20.0 | 440 | 0.0799 | 0.8706 | 0.9186 | 0.9784 | 0.9945 | 0.8131 | 0.9482 | 0.9840 | 0.7180 | 0.9096 |
| 0.0776 | 20.9091 | 460 | 0.0769 | 0.8733 | 0.9256 | 0.9796 | 0.9935 | 0.8283 | 0.9551 | 0.9859 | 0.7192 | 0.9149 |
| 0.0662 | 21.8182 | 480 | 0.0734 | 0.8698 | 0.9184 | 0.9793 | 0.9921 | 0.7996 | 0.9634 | 0.9856 | 0.7094 | 0.9143 |
| 0.071 | 22.7273 | 500 | 0.0727 | 0.8671 | 0.9088 | 0.9791 | 0.9932 | 0.7694 | 0.9637 | 0.9854 | 0.7023 | 0.9135 |
| 0.0563 | 23.6364 | 520 | 0.0701 | 0.8723 | 0.9325 | 0.9794 | 0.9922 | 0.8505 | 0.9548 | 0.9861 | 0.7168 | 0.9138 |
| 0.0666 | 24.5455 | 540 | 0.0730 | 0.8678 | 0.9177 | 0.9786 | 0.9889 | 0.7913 | 0.9731 | 0.9849 | 0.7067 | 0.9118 |
| 0.0662 | 25.4545 | 560 | 0.0677 | 0.8743 | 0.9269 | 0.9795 | 0.9945 | 0.8365 | 0.9498 | 0.9856 | 0.7232 | 0.9142 |
| 0.0566 | 26.3636 | 580 | 0.0678 | 0.8737 | 0.9329 | 0.9794 | 0.9936 | 0.8561 | 0.9492 | 0.9859 | 0.7214 | 0.9138 |
| 0.0536 | 27.2727 | 600 | 0.0646 | 0.8745 | 0.9278 | 0.9800 | 0.9930 | 0.8320 | 0.9584 | 0.9866 | 0.7206 | 0.9164 |
| 0.0736 | 28.1818 | 620 | 0.0671 | 0.8657 | 0.9091 | 0.9790 | 0.9914 | 0.7651 | 0.9707 | 0.9856 | 0.6980 | 0.9134 |
| 0.0651 | 29.0909 | 640 | 0.0631 | 0.8717 | 0.9166 | 0.9800 | 0.9920 | 0.7889 | 0.9688 | 0.9864 | 0.7123 | 0.9166 |
| 0.0484 | 30.0 | 660 | 0.0627 | 0.8754 | 0.9213 | 0.9803 | 0.9926 | 0.8058 | 0.9655 | 0.9865 | 0.7217 | 0.9180 |
| 0.0527 | 30.9091 | 680 | 0.0620 | 0.8762 | 0.9316 | 0.9801 | 0.9912 | 0.8395 | 0.9639 | 0.9865 | 0.7251 | 0.9171 |
| 0.0592 | 31.8182 | 700 | 0.0646 | 0.8708 | 0.9337 | 0.9789 | 0.9947 | 0.8652 | 0.9411 | 0.9859 | 0.7160 | 0.9106 |
| 0.0591 | 32.7273 | 720 | 0.0608 | 0.8749 | 0.9204 | 0.9805 | 0.9929 | 0.8026 | 0.9656 | 0.9869 | 0.7196 | 0.9181 |
| 0.052 | 33.6364 | 740 | 0.0603 | 0.8773 | 0.9283 | 0.9803 | 0.9942 | 0.8360 | 0.9546 | 0.9865 | 0.7282 | 0.9174 |
| 0.0506 | 34.5455 | 760 | 0.0611 | 0.8758 | 0.9367 | 0.9799 | 0.9932 | 0.8654 | 0.9514 | 0.9867 | 0.7252 | 0.9155 |
| 0.0467 | 35.4545 | 780 | 0.0601 | 0.8771 | 0.9336 | 0.9803 | 0.9944 | 0.8560 | 0.9503 | 0.9868 | 0.7274 | 0.9170 |
| 0.0466 | 36.3636 | 800 | 0.0578 | 0.8779 | 0.9249 | 0.9810 | 0.9937 | 0.8186 | 0.9623 | 0.9873 | 0.7263 | 0.9202 |
| 0.0456 | 37.2727 | 820 | 0.0583 | 0.8757 | 0.9191 | 0.9806 | 0.9944 | 0.8020 | 0.9607 | 0.9868 | 0.7217 | 0.9185 |
| 0.0481 | 38.1818 | 840 | 0.0606 | 0.8642 | 0.9006 | 0.9798 | 0.9940 | 0.7383 | 0.9696 | 0.9869 | 0.6897 | 0.9159 |
| 0.0601 | 39.0909 | 860 | 0.0566 | 0.8743 | 0.9159 | 0.9808 | 0.9936 | 0.7863 | 0.9677 | 0.9874 | 0.7155 | 0.9198 |
| 0.0441 | 40.0 | 880 | 0.0568 | 0.8799 | 0.9306 | 0.9810 | 0.9934 | 0.8377 | 0.9608 | 0.9873 | 0.7319 | 0.9204 |
| 0.0416 | 40.9091 | 900 | 0.0569 | 0.8779 | 0.9313 | 0.9806 | 0.9948 | 0.8469 | 0.9520 | 0.9871 | 0.7286 | 0.9180 |
| 0.0489 | 41.8182 | 920 | 0.0557 | 0.8797 | 0.9288 | 0.9813 | 0.9940 | 0.8318 | 0.9605 | 0.9876 | 0.7300 | 0.9215 |
| 0.0477 | 42.7273 | 940 | 0.0564 | 0.8759 | 0.9191 | 0.9809 | 0.9935 | 0.7973 | 0.9666 | 0.9873 | 0.7204 | 0.9200 |
| 0.045 | 43.6364 | 960 | 0.0583 | 0.8774 | 0.9408 | 0.9801 | 0.9910 | 0.8721 | 0.9592 | 0.9868 | 0.7284 | 0.9171 |
| 0.0402 | 44.5455 | 980 | 0.0561 | 0.8795 | 0.9368 | 0.9808 | 0.9951 | 0.8664 | 0.9488 | 0.9874 | 0.7321 | 0.9190 |
| 0.0463 | 45.4545 | 1000 | 0.0557 | 0.8761 | 0.9182 | 0.9811 | 0.9928 | 0.7907 | 0.9711 | 0.9876 | 0.7198 | 0.9209 |
| 0.0356 | 46.3636 | 1020 | 0.0559 | 0.8793 | 0.9308 | 0.9810 | 0.9945 | 0.8421 | 0.9558 | 0.9874 | 0.7306 | 0.9199 |
| 0.0446 | 47.2727 | 1040 | 0.0569 | 0.8749 | 0.9249 | 0.9799 | 0.9963 | 0.8323 | 0.9461 | 0.9863 | 0.7233 | 0.9151 |
| 0.041 | 48.1818 | 1060 | 0.0561 | 0.8772 | 0.9299 | 0.9807 | 0.9931 | 0.8360 | 0.9606 | 0.9872 | 0.7246 | 0.9198 |
| 0.042 | 49.0909 | 1080 | 0.0543 | 0.8815 | 0.9339 | 0.9815 | 0.9931 | 0.8457 | 0.9630 | 0.9880 | 0.7339 | 0.9226 |
| 0.0502 | 50.0 | 1100 | 0.0557 | 0.8742 | 0.9168 | 0.9809 | 0.9937 | 0.7896 | 0.9671 | 0.9876 | 0.7148 | 0.9203 |
| 0.0403 | 50.9091 | 1120 | 0.0552 | 0.8771 | 0.9231 | 0.9810 | 0.9917 | 0.8061 | 0.9716 | 0.9873 | 0.7235 | 0.9205 |
| 0.0412 | 51.8182 | 1140 | 0.0544 | 0.8779 | 0.9203 | 0.9811 | 0.9951 | 0.8056 | 0.9603 | 0.9873 | 0.7257 | 0.9205 |
| 0.0426 | 52.7273 | 1160 | 0.0539 | 0.8797 | 0.9310 | 0.9811 | 0.9943 | 0.8420 | 0.9568 | 0.9874 | 0.7311 | 0.9204 |
| 0.0423 | 53.6364 | 1180 | 0.0538 | 0.8795 | 0.9291 | 0.9813 | 0.9940 | 0.8325 | 0.9610 | 0.9878 | 0.7289 | 0.9217 |
| 0.0302 | 54.5455 | 1200 | 0.0545 | 0.8778 | 0.9207 | 0.9812 | 0.9940 | 0.8030 | 0.9652 | 0.9875 | 0.7246 | 0.9213 |
| 0.0563 | 55.4545 | 1220 | 0.0546 | 0.8786 | 0.9303 | 0.9812 | 0.9935 | 0.8360 | 0.9613 | 0.9878 | 0.7267 | 0.9213 |
| 0.0462 | 56.3636 | 1240 | 0.0544 | 0.8758 | 0.9166 | 0.9811 | 0.9939 | 0.7887 | 0.9673 | 0.9874 | 0.7191 | 0.9210 |
| 0.0323 | 57.2727 | 1260 | 0.0549 | 0.8784 | 0.9242 | 0.9812 | 0.9924 | 0.8105 | 0.9697 | 0.9875 | 0.7261 | 0.9215 |
| 0.036 | 58.1818 | 1280 | 0.0528 | 0.8809 | 0.9357 | 0.9814 | 0.9939 | 0.8554 | 0.9577 | 0.9880 | 0.7326 | 0.9220 |
| 0.0368 | 59.0909 | 1300 | 0.0553 | 0.8770 | 0.9240 | 0.9809 | 0.9929 | 0.8131 | 0.9662 | 0.9872 | 0.7229 | 0.9208 |
| 0.0406 | 60.0 | 1320 | 0.0549 | 0.8726 | 0.9119 | 0.9809 | 0.9947 | 0.7753 | 0.9658 | 0.9875 | 0.7101 | 0.9203 |
| 0.0316 | 60.9091 | 1340 | 0.0545 | 0.8788 | 0.9244 | 0.9811 | 0.9942 | 0.8177 | 0.9612 | 0.9870 | 0.7280 | 0.9213 |
| 0.0343 | 61.8182 | 1360 | 0.0530 | 0.8808 | 0.9294 | 0.9815 | 0.9937 | 0.8317 | 0.9628 | 0.9877 | 0.7321 | 0.9225 |
| 0.0342 | 62.7273 | 1380 | 0.0535 | 0.8801 | 0.9315 | 0.9813 | 0.9934 | 0.8396 | 0.9614 | 0.9875 | 0.7306 | 0.9221 |
| 0.0515 | 63.6364 | 1400 | 0.0543 | 0.8794 | 0.9299 | 0.9812 | 0.9933 | 0.8340 | 0.9623 | 0.9873 | 0.7290 | 0.9219 |
| 0.0312 | 64.5455 | 1420 | 0.0533 | 0.8798 | 0.9312 | 0.9813 | 0.9944 | 0.8419 | 0.9573 | 0.9876 | 0.7302 | 0.9216 |
| 0.0359 | 65.4545 | 1440 | 0.0548 | 0.8785 | 0.9351 | 0.9809 | 0.9928 | 0.8527 | 0.9597 | 0.9874 | 0.7273 | 0.9207 |
| 0.0333 | 66.3636 | 1460 | 0.0521 | 0.8813 | 0.9276 | 0.9817 | 0.9948 | 0.8276 | 0.9604 | 0.9878 | 0.7327 | 0.9233 |
| 0.0358 | 67.2727 | 1480 | 0.0531 | 0.8797 | 0.9284 | 0.9814 | 0.9939 | 0.8292 | 0.9622 | 0.9878 | 0.7290 | 0.9221 |
| 0.0341 | 68.1818 | 1500 | 0.0528 | 0.8813 | 0.9319 | 0.9815 | 0.9937 | 0.8410 | 0.9611 | 0.9877 | 0.7336 | 0.9226 |
| 0.0316 | 69.0909 | 1520 | 0.0551 | 0.8763 | 0.9212 | 0.9810 | 0.9918 | 0.7992 | 0.9727 | 0.9875 | 0.7209 | 0.9206 |
| 0.0261 | 70.0 | 1540 | 0.0534 | 0.8803 | 0.9286 | 0.9815 | 0.9930 | 0.8270 | 0.9659 | 0.9877 | 0.7308 | 0.9225 |
| 0.0331 | 70.9091 | 1560 | 0.0551 | 0.8764 | 0.9190 | 0.9810 | 0.9927 | 0.7937 | 0.9705 | 0.9874 | 0.7210 | 0.9209 |
| 0.0398 | 71.8182 | 1580 | 0.0547 | 0.8758 | 0.9183 | 0.9811 | 0.9930 | 0.7917 | 0.9701 | 0.9876 | 0.7186 | 0.9212 |
| 0.0368 | 72.7273 | 1600 | 0.0524 | 0.8804 | 0.9280 | 0.9814 | 0.9943 | 0.8288 | 0.9609 | 0.9877 | 0.7312 | 0.9222 |
| 0.0368 | 73.6364 | 1620 | 0.0526 | 0.8811 | 0.9268 | 0.9817 | 0.9939 | 0.8217 | 0.9648 | 0.9879 | 0.7320 | 0.9234 |
| 0.0319 | 74.5455 | 1640 | 0.0528 | 0.8812 | 0.9337 | 0.9814 | 0.9934 | 0.8464 | 0.9612 | 0.9878 | 0.7335 | 0.9223 |
| 0.0321 | 75.4545 | 1660 | 0.0534 | 0.8802 | 0.9299 | 0.9813 | 0.9938 | 0.8349 | 0.9611 | 0.9876 | 0.7310 | 0.9221 |
| 0.0339 | 76.3636 | 1680 | 0.0539 | 0.8794 | 0.9283 | 0.9813 | 0.9927 | 0.8256 | 0.9667 | 0.9877 | 0.7285 | 0.9221 |
| 0.0289 | 77.2727 | 1700 | 0.0530 | 0.8795 | 0.9317 | 0.9813 | 0.9940 | 0.8421 | 0.9589 | 0.9878 | 0.7289 | 0.9218 |
| 0.0355 | 78.1818 | 1720 | 0.0540 | 0.8797 | 0.9348 | 0.9811 | 0.9940 | 0.8546 | 0.9559 | 0.9875 | 0.7307 | 0.9209 |
| 0.037 | 79.0909 | 1740 | 0.0531 | 0.8805 | 0.9360 | 0.9812 | 0.9932 | 0.8554 | 0.9594 | 0.9877 | 0.7320 | 0.9217 |
| 0.0354 | 80.0 | 1760 | 0.0530 | 0.8808 | 0.9319 | 0.9813 | 0.9946 | 0.8446 | 0.9566 | 0.9876 | 0.7334 | 0.9215 |
| 0.0335 | 80.9091 | 1780 | 0.0528 | 0.8805 | 0.9359 | 0.9813 | 0.9932 | 0.8549 | 0.9597 | 0.9878 | 0.7319 | 0.9217 |
| 0.0311 | 81.8182 | 1800 | 0.0529 | 0.8809 | 0.9361 | 0.9813 | 0.9933 | 0.8555 | 0.9597 | 0.9878 | 0.7328 | 0.9221 |
| 0.0395 | 82.7273 | 1820 | 0.0543 | 0.8772 | 0.9413 | 0.9805 | 0.9947 | 0.8838 | 0.9454 | 0.9875 | 0.7264 | 0.9176 |
| 0.0424 | 83.6364 | 1840 | 0.0540 | 0.8791 | 0.9403 | 0.9809 | 0.9934 | 0.8733 | 0.9541 | 0.9877 | 0.7294 | 0.9203 |
| 0.035 | 84.5455 | 1860 | 0.0522 | 0.8814 | 0.9275 | 0.9817 | 0.9942 | 0.8255 | 0.9629 | 0.9879 | 0.7331 | 0.9232 |
| 0.0418 | 85.4545 | 1880 | 0.0522 | 0.8803 | 0.9285 | 0.9815 | 0.9941 | 0.8297 | 0.9617 | 0.9879 | 0.7305 | 0.9225 |
| 0.0365 | 86.3636 | 1900 | 0.0536 | 0.8802 | 0.9378 | 0.9812 | 0.9935 | 0.8631 | 0.9569 | 0.9878 | 0.7315 | 0.9213 |
| 0.0344 | 87.2727 | 1920 | 0.0532 | 0.8809 | 0.9364 | 0.9812 | 0.9934 | 0.8580 | 0.9579 | 0.9875 | 0.7341 | 0.9211 |
| 0.0257 | 88.1818 | 1940 | 0.0540 | 0.8775 | 0.9189 | 0.9814 | 0.9932 | 0.7925 | 0.9710 | 0.9879 | 0.7223 | 0.9223 |
| 0.0333 | 89.0909 | 1960 | 0.0534 | 0.8797 | 0.9253 | 0.9814 | 0.9926 | 0.8140 | 0.9693 | 0.9876 | 0.7289 | 0.9226 |
| 0.0407 | 90.0 | 1980 | 0.0531 | 0.8818 | 0.9335 | 0.9814 | 0.9935 | 0.8463 | 0.9608 | 0.9876 | 0.7354 | 0.9225 |
| 0.031 | 90.9091 | 2000 | 0.0529 | 0.8822 | 0.9345 | 0.9815 | 0.9935 | 0.8496 | 0.9604 | 0.9876 | 0.7366 | 0.9224 |
| 0.0371 | 91.8182 | 2020 | 0.0525 | 0.8810 | 0.9316 | 0.9815 | 0.9934 | 0.8388 | 0.9628 | 0.9878 | 0.7325 | 0.9226 |
| 0.0418 | 92.7273 | 2040 | 0.0526 | 0.8809 | 0.9302 | 0.9815 | 0.9932 | 0.8329 | 0.9644 | 0.9878 | 0.7321 | 0.9229 |
| 0.0337 | 93.6364 | 2060 | 0.0530 | 0.8808 | 0.9279 | 0.9815 | 0.9936 | 0.8259 | 0.9641 | 0.9877 | 0.7318 | 0.9228 |
| 0.0336 | 94.5455 | 2080 | 0.0525 | 0.8810 | 0.9289 | 0.9815 | 0.9944 | 0.8320 | 0.9602 | 0.9877 | 0.7331 | 0.9222 |
| 0.0343 | 95.4545 | 2100 | 0.0527 | 0.8811 | 0.9285 | 0.9816 | 0.9937 | 0.8279 | 0.9639 | 0.9878 | 0.7323 | 0.9233 |
| 0.0337 | 96.3636 | 2120 | 0.0535 | 0.8809 | 0.9311 | 0.9815 | 0.9933 | 0.8367 | 0.9633 | 0.9877 | 0.7322 | 0.9228 |
| 0.0297 | 97.2727 | 2140 | 0.0527 | 0.8804 | 0.9292 | 0.9815 | 0.9937 | 0.8312 | 0.9627 | 0.9877 | 0.7308 | 0.9228 |
| 0.0302 | 98.1818 | 2160 | 0.0529 | 0.8790 | 0.9227 | 0.9815 | 0.9941 | 0.8084 | 0.9655 | 0.9878 | 0.7265 | 0.9228 |
| 0.037 | 99.0909 | 2180 | 0.0534 | 0.8814 | 0.9331 | 0.9815 | 0.9936 | 0.8448 | 0.9610 | 0.9877 | 0.7338 | 0.9226 |
| 0.0591 | 100.0 | 2200 | 0.0520 | 0.8812 | 0.9305 | 0.9816 | 0.9940 | 0.8364 | 0.9611 | 0.9878 | 0.7330 | 0.9229 |
| 0.0294 | 100.9091 | 2220 | 0.0538 | 0.8792 | 0.9235 | 0.9815 | 0.9932 | 0.8091 | 0.9682 | 0.9877 | 0.7270 | 0.9228 |
| 0.0301 | 101.8182 | 2240 | 0.0530 | 0.8817 | 0.9333 | 0.9815 | 0.9930 | 0.8434 | 0.9634 | 0.9877 | 0.7347 | 0.9227 |
| 0.0286 | 102.7273 | 2260 | 0.0533 | 0.8814 | 0.9297 | 0.9816 | 0.9933 | 0.8310 | 0.9647 | 0.9877 | 0.7334 | 0.9230 |
| 0.0332 | 103.6364 | 2280 | 0.0523 | 0.8823 | 0.9335 | 0.9816 | 0.9942 | 0.8474 | 0.9589 | 0.9878 | 0.7360 | 0.9229 |
| 0.0329 | 104.5455 | 2300 | 0.0526 | 0.8820 | 0.9324 | 0.9816 | 0.9942 | 0.8436 | 0.9593 | 0.9878 | 0.7354 | 0.9228 |
| 0.0295 | 105.4545 | 2320 | 0.0528 | 0.8816 | 0.9326 | 0.9814 | 0.9945 | 0.8461 | 0.9571 | 0.9877 | 0.7353 | 0.9220 |
| 0.0288 | 106.3636 | 2340 | 0.0526 | 0.8813 | 0.9288 | 0.9816 | 0.9939 | 0.8295 | 0.9629 | 0.9878 | 0.7330 | 0.9230 |
| 0.0352 | 107.2727 | 2360 | 0.0527 | 0.8814 | 0.9309 | 0.9816 | 0.9937 | 0.8365 | 0.9625 | 0.9878 | 0.7334 | 0.9230 |
| 0.0334 | 108.1818 | 2380 | 0.0526 | 0.8816 | 0.9315 | 0.9816 | 0.9944 | 0.8407 | 0.9592 | 0.9879 | 0.7342 | 0.9229 |
| 0.033 | 109.0909 | 2400 | 0.0520 | 0.8817 | 0.9300 | 0.9817 | 0.9937 | 0.8328 | 0.9635 | 0.9880 | 0.7337 | 0.9235 |
| 0.0267 | 110.0 | 2420 | 0.0523 | 0.8814 | 0.9296 | 0.9816 | 0.9941 | 0.8328 | 0.9619 | 0.9879 | 0.7332 | 0.9232 |
| 0.0281 | 110.9091 | 2440 | 0.0527 | 0.8817 | 0.9318 | 0.9816 | 0.9939 | 0.8406 | 0.9609 | 0.9878 | 0.7346 | 0.9228 |
| 0.0329 | 111.8182 | 2460 | 0.0530 | 0.8816 | 0.9302 | 0.9816 | 0.9941 | 0.8353 | 0.9613 | 0.9877 | 0.7342 | 0.9229 |
| 0.0374 | 112.7273 | 2480 | 0.0530 | 0.8812 | 0.9300 | 0.9815 | 0.9936 | 0.8333 | 0.9631 | 0.9877 | 0.7330 | 0.9228 |
| 0.0298 | 113.6364 | 2500 | 0.0530 | 0.8811 | 0.9290 | 0.9815 | 0.9941 | 0.8314 | 0.9617 | 0.9877 | 0.7330 | 0.9227 |
| 0.028 | 114.5455 | 2520 | 0.0529 | 0.8812 | 0.9286 | 0.9816 | 0.9936 | 0.8280 | 0.9642 | 0.9878 | 0.7327 | 0.9231 |
| 0.0292 | 115.4545 | 2540 | 0.0535 | 0.8811 | 0.9293 | 0.9815 | 0.9932 | 0.8295 | 0.9651 | 0.9877 | 0.7327 | 0.9230 |
| 0.0326 | 116.3636 | 2560 | 0.0530 | 0.8814 | 0.9305 | 0.9815 | 0.9939 | 0.8361 | 0.9616 | 0.9877 | 0.7339 | 0.9228 |
| 0.0312 | 117.2727 | 2580 | 0.0532 | 0.8814 | 0.9306 | 0.9815 | 0.9937 | 0.8359 | 0.9623 | 0.9877 | 0.7338 | 0.9227 |
| 0.0292 | 118.1818 | 2600 | 0.0533 | 0.8811 | 0.9291 | 0.9815 | 0.9937 | 0.8303 | 0.9633 | 0.9877 | 0.7328 | 0.9228 |
| 0.0287 | 119.0909 | 2620 | 0.0533 | 0.8812 | 0.9306 | 0.9815 | 0.9937 | 0.8362 | 0.9619 | 0.9877 | 0.7333 | 0.9226 |
| 0.0352 | 120.0 | 2640 | 0.0528 | 0.8813 | 0.9318 | 0.9815 | 0.9940 | 0.8416 | 0.9598 | 0.9877 | 0.7339 | 0.9224 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"necrosis",
"root"
] |
EPFL-ECEO/segformer-b5-finetuned-coralscapes-1024-1024 |
# Model Card for segformer-b5-finetuned-coralscapes-1024-1024
SegFormer model with a MiT-B5 backbone fine-tuned on Coralscapes at resolution 1024x1024, as introduced in [The Coralscapes Dataset: Semantic Scene Understanding in Coral Reefs](https://arxiv.org/abs/2503.20000).
## Model Details
### Model Description
- **Model type:** SegFormer
- **Finetuned from model:** [SegFormer (b5-sized) encoder pre-trained-only (`nvidia/mit-b5`)](https://huggingface.co/nvidia/mit-b5)
### Model Sources
- **Repository:** [coralscapesScripts](https://github.com/eceo-epfl/coralscapesScripts/)
- **Demo:** [Hugging Face Spaces](https://huggingface.co/spaces/EPFL-ECEO/coralscapes_demo)
## How to Get Started with the Model
The simplest way to use this model to segment an image of the Coralscapes dataset is as follows:
```python
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation
from PIL import Image
from datasets import load_dataset
# Load an image from the coralscapes dataset or load your own image
dataset = load_dataset("EPFL-ECEO/coralscapes")
image = dataset["test"][42]["image"]
preprocessor = SegformerImageProcessor.from_pretrained("EPFL-ECEO/segformer-b5-finetuned-coralscapes-1024-1024")
model = SegformerForSemanticSegmentation.from_pretrained("EPFL-ECEO/segformer-b5-finetuned-coralscapes-1024-1024")
inputs = preprocessor(image, return_tensors = "pt")
outputs = model(**inputs)
outputs = preprocessor.post_process_semantic_segmentation(outputs, target_sizes=[(image.size[1], image.size[0])])
label_pred = outputs[0].numpy()
```
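The integer class indices in `label_pred` can be mapped back to names through the model config; a small follow-up sketch (the `id2label` attribute is standard in `transformers` configs, and for this checkpoint the stored labels are simply the numeric class ids):
```python
# Map the predicted class indices back to the label names stored in the config.
id2label = model.config.id2label
present = sorted(set(label_pred.flatten().tolist()))
print({i: id2label[i] for i in present})
```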
While the approach above should still work for images of different sizes and scales, for images that are not close to the training size of the model (1024x1024)
we recommend the following sliding-window approach to achieve better results:
```python
import torch
import torch.nn.functional as F
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation
from PIL import Image
import numpy as np
from datasets import load_dataset
device = 'cuda' if torch.cuda.is_available() else 'cpu'
def resize_image(image, target_size=1024):
"""
    Used to resize the image such that the smaller side equals 1024.
    Note that PIL's Image.size returns (width, height), so the h_img/w_img
    names below are transposed; the arithmetic stays consistent because
    Image.resize also expects a (width, height) tuple.
    """
    h_img, w_img = image.size
if h_img < w_img:
new_h, new_w = target_size, int(w_img * (target_size / h_img))
else:
new_h, new_w = int(h_img * (target_size / w_img)), target_size
resized_img = image.resize((new_h, new_w))
return resized_img
def segment_image(image, preprocessor, model, crop_size = (1024, 1024), num_classes = 40, transform=None):
"""
Finds an optimal stride based on the image size and aspect ratio to create
overlapping sliding windows of size 1024x1024 which are then fed into the model.
"""
h_crop, w_crop = crop_size
img = torch.Tensor(np.array(resize_image(image, target_size=1024)).transpose(2, 0, 1)).unsqueeze(0)
batch_size, _, h_img, w_img = img.size()
if transform:
img = torch.Tensor(transform(image = img.numpy())["image"]).to(device)
h_grids = int(np.round(3/2*h_img/h_crop)) if h_img > h_crop else 1
w_grids = int(np.round(3/2*w_img/w_crop)) if w_img > w_crop else 1
h_stride = int((h_img - h_crop + h_grids -1)/(h_grids -1)) if h_grids > 1 else h_crop
w_stride = int((w_img - w_crop + w_grids -1)/(w_grids -1)) if w_grids > 1 else w_crop
preds = img.new_zeros((batch_size, num_classes, h_img, w_img))
count_mat = img.new_zeros((batch_size, 1, h_img, w_img))
for h_idx in range(h_grids):
for w_idx in range(w_grids):
y1 = h_idx * h_stride
x1 = w_idx * w_stride
y2 = min(y1 + h_crop, h_img)
x2 = min(x1 + w_crop, w_img)
y1 = max(y2 - h_crop, 0)
x1 = max(x2 - w_crop, 0)
crop_img = img[:, :, y1:y2, x1:x2]
with torch.no_grad():
                if preprocessor:
                    inputs = preprocessor(crop_img, return_tensors = "pt")
                    inputs["pixel_values"] = inputs["pixel_values"].to(device)
                else:
                    # Wrap the raw tensor in a dict so model(**inputs) below works either way.
                    inputs = {"pixel_values": crop_img.to(device)}
outputs = model(**inputs)
resized_logits = F.interpolate(
outputs.logits[0].unsqueeze(dim=0), size=crop_img.shape[-2:], mode="bilinear", align_corners=False
)
preds += F.pad(resized_logits,
(int(x1), int(preds.shape[3] - x2), int(y1),
int(preds.shape[2] - y2))).cpu()
count_mat[:, :, y1:y2, x1:x2] += 1
assert (count_mat == 0).sum() == 0
preds = preds / count_mat
preds = preds.argmax(dim=1)
preds = F.interpolate(preds.unsqueeze(0).type(torch.uint8), size=image.size[::-1], mode='nearest')
label_pred = preds.squeeze().cpu().numpy()
return label_pred
# Load an image from the coralscapes dataset or load your own image
dataset = load_dataset("EPFL-ECEO/coralscapes")
image = dataset["test"][42]["image"]
preprocessor = SegformerImageProcessor.from_pretrained("EPFL-ECEO/segformer-b5-finetuned-coralscapes-1024-1024")
model = SegformerForSemanticSegmentation.from_pretrained("EPFL-ECEO/segformer-b5-finetuned-coralscapes-1024-1024").to(device)  # move the model to the same device as the inputs
label_pred = segment_image(image, preprocessor, model)
```
## Training & Evaluation Details
### Data
The model is trained and evaluated on the [Coralscapes dataset](https://huggingface.co/datasets/EPFL-ECEO/coralscapes) which is a general-purpose dense semantic segmentation dataset for coral reefs.
### Procedure
Training mostly follows the original SegFormer [implementation](https://proceedings.neurips.cc/paper_files/paper/2021/file/64f1f27bf1b4ec22924fd0acb550c235-Paper.pdf), aside from the use of stronger augmentations during training.
We used a batch size of 4 for 100 epochs with the AdamW optimizer, an initial learning rate of 6e-5, a weight decay of 1e-2, and a polynomial learning-rate scheduler with a power of 1.
During training, images are randomly scaled by a factor between 1.02 and 2 with an aspect ratio between 3/4 and 4/3, randomly rotated by up to 15 degrees (and cropped to exclude non-image areas),
and randomly color-jittered, with contrast, saturation, and brightness factors between 0.8 and 1.2 and hue shifts between -0.05 and 0.05.
Input images are normalized using the ImageNet mean and standard deviation. For evaluation, a non-overlapping sliding-window strategy is employed with a window size of 1024x1024.
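As a rough illustration of how this recipe maps onto code (a minimal sketch, not the actual training script from the linked repository; the geometric scale/rotation crops are omitted, and `num_labels`, `steps_per_epoch`, and the zero-warmup schedule are assumptions):
```python
import torch
from torchvision import transforms
from transformers import SegformerForSemanticSegmentation, get_polynomial_decay_schedule_with_warmup

model = SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b5", num_labels=40)

# Color jitter and ImageNet normalization as described above.
augment = transforms.Compose([
    transforms.ColorJitter(brightness=(0.8, 1.2), contrast=(0.8, 1.2),
                           saturation=(0.8, 1.2), hue=(-0.05, 0.05)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet mean
                         std=[0.229, 0.224, 0.225]),   # ImageNet std
])

num_epochs, steps_per_epoch = 100, 100  # steps_per_epoch depends on the dataset
optimizer = torch.optim.AdamW(model.parameters(), lr=6e-5, weight_decay=1e-2)
# A polynomial schedule with power=1 decays the learning rate linearly towards zero.
scheduler = get_polynomial_decay_schedule_with_warmup(
    optimizer, num_warmup_steps=0,
    num_training_steps=num_epochs * steps_per_epoch, power=1.0,
)
```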
### Results
- Test Accuracy: 82.761
- Test Mean IoU: 57.800
## Citation
If you find this project useful, please consider citing:
```bibtex
@misc{sauder2025coralscapesdatasetsemanticscene,
title={The Coralscapes Dataset: Semantic Scene Understanding in Coral Reefs},
author={Jonathan Sauder and Viktor Domazetoski and Guilhem Banc-Prandi and Gabriela Perna and Anders Meibom and Devis Tuia},
year={2025},
eprint={2503.20000},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2503.20000},
}
```
| [
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
"10",
"11",
"12",
"13",
"14",
"15",
"16",
"17",
"18",
"19",
"20",
"21",
"22",
"23",
"24",
"25",
"26",
"27",
"28",
"29",
"30",
"31",
"32",
"33",
"34",
"35",
"36",
"37",
"38",
"39"
] |
jhaberbe/segformer-b0-finetuned-lipid-droplets-v4 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-lipid-droplets-v4
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jhaberbe/lipid-droplets-v4 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0579
- Mean Iou: 0.2161
- Mean Accuracy: 0.4321
- Overall Accuracy: 0.4321
- Accuracy Unlabeled: nan
- Accuracy Lipid: 0.4321
- Iou Unlabeled: 0.0
- Iou Lipid: 0.4321
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1000
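For reference, these settings correspond roughly to the following `TrainingArguments` sketch (a hedged reconstruction, not the original script; `output_dir` is a placeholder and the dataset objects `train_ds`/`eval_ds` are assumed to be defined elsewhere):
```python
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b0", num_labels=2)
args = TrainingArguments(
    output_dir="segformer-b0-finetuned-lipid-droplets-v4",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=1000,
)
trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```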
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lipid | Iou Unlabeled | Iou Lipid |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:-------------:|:---------:|
| 0.0683 | 5.0 | 20 | 0.4807 | 0.006 | 0.012 | 0.012 | nan | 0.012 | 0.0 | 0.012 |
| 0.0727 | 10.0 | 40 | 0.2153 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0324 | 15.0 | 60 | 0.4132 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0229 | 20.0 | 80 | 0.1490 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0314 | 25.0 | 100 | 0.4783 | 0.0097 | 0.0193 | 0.0193 | nan | 0.0193 | 0.0 | 0.0193 |
| 0.0187 | 30.0 | 120 | 0.0686 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0326 | 35.0 | 140 | 0.0456 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.024 | 40.0 | 160 | 0.2099 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0162 | 45.0 | 180 | 0.1570 | 0.0009 | 0.0018 | 0.0018 | nan | 0.0018 | 0.0 | 0.0018 |
| 0.0129 | 50.0 | 200 | 0.1300 | 0.0101 | 0.0201 | 0.0201 | nan | 0.0201 | 0.0 | 0.0201 |
| 0.013 | 55.0 | 220 | 0.1372 | 0.0044 | 0.0088 | 0.0088 | nan | 0.0088 | 0.0 | 0.0088 |
| 0.0147 | 60.0 | 240 | 0.1529 | 0.0484 | 0.0968 | 0.0968 | nan | 0.0968 | 0.0 | 0.0968 |
| 0.0192 | 65.0 | 260 | 0.0777 | 0.0094 | 0.0188 | 0.0188 | nan | 0.0188 | 0.0 | 0.0188 |
| 0.0105 | 70.0 | 280 | 0.1133 | 0.0073 | 0.0146 | 0.0146 | nan | 0.0146 | 0.0 | 0.0146 |
| 0.0086 | 75.0 | 300 | 0.1237 | 0.0039 | 0.0077 | 0.0077 | nan | 0.0077 | 0.0 | 0.0077 |
| 0.0151 | 80.0 | 320 | 0.0472 | 0.0001 | 0.0001 | 0.0001 | nan | 0.0001 | 0.0 | 0.0001 |
| 0.013 | 85.0 | 340 | 0.0373 | 0.0014 | 0.0028 | 0.0028 | nan | 0.0028 | 0.0 | 0.0028 |
| 0.053 | 90.0 | 360 | 0.0745 | 0.0036 | 0.0072 | 0.0072 | nan | 0.0072 | 0.0 | 0.0072 |
| 0.0117 | 95.0 | 380 | 0.0097 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.018 | 100.0 | 400 | 0.1363 | 0.1399 | 0.2797 | 0.2797 | nan | 0.2797 | 0.0 | 0.2797 |
| 0.0099 | 105.0 | 420 | 0.0888 | 0.0053 | 0.0106 | 0.0106 | nan | 0.0106 | 0.0 | 0.0106 |
| 0.0063 | 110.0 | 440 | 0.0889 | 0.0134 | 0.0269 | 0.0269 | nan | 0.0269 | 0.0 | 0.0269 |
| 0.0127 | 115.0 | 460 | 0.0389 | 0.0059 | 0.0119 | 0.0119 | nan | 0.0119 | 0.0 | 0.0119 |
| 0.0153 | 120.0 | 480 | 0.0579 | 0.0048 | 0.0095 | 0.0095 | nan | 0.0095 | 0.0 | 0.0095 |
| 0.0064 | 125.0 | 500 | 0.0589 | 0.0415 | 0.0830 | 0.0830 | nan | 0.0830 | 0.0 | 0.0830 |
| 0.0074 | 130.0 | 520 | 0.1361 | 0.0917 | 0.1833 | 0.1833 | nan | 0.1833 | 0.0 | 0.1833 |
| 0.0062 | 135.0 | 540 | 0.1212 | 0.1098 | 0.2196 | 0.2196 | nan | 0.2196 | 0.0 | 0.2196 |
| 0.0053 | 140.0 | 560 | 0.0814 | 0.0954 | 0.1908 | 0.1908 | nan | 0.1908 | 0.0 | 0.1908 |
| 0.0065 | 145.0 | 580 | 0.1418 | 0.1704 | 0.3408 | 0.3408 | nan | 0.3408 | 0.0 | 0.3408 |
| 0.0118 | 150.0 | 600 | 0.0541 | 0.0210 | 0.0419 | 0.0419 | nan | 0.0419 | 0.0 | 0.0419 |
| 0.0111 | 155.0 | 620 | 0.0801 | 0.0419 | 0.0839 | 0.0839 | nan | 0.0839 | 0.0 | 0.0839 |
| 0.0077 | 160.0 | 640 | 0.0628 | 0.2041 | 0.4081 | 0.4081 | nan | 0.4081 | 0.0 | 0.4081 |
| 0.0118 | 165.0 | 660 | 0.1001 | 0.1372 | 0.2743 | 0.2743 | nan | 0.2743 | 0.0 | 0.2743 |
| 0.0078 | 170.0 | 680 | 0.1254 | 0.1439 | 0.2879 | 0.2879 | nan | 0.2879 | 0.0 | 0.2879 |
| 0.0058 | 175.0 | 700 | 0.1119 | 0.2726 | 0.5452 | 0.5452 | nan | 0.5452 | 0.0 | 0.5452 |
| 0.0097 | 180.0 | 720 | 0.1608 | 0.3098 | 0.6196 | 0.6196 | nan | 0.6196 | 0.0 | 0.6196 |
| 0.0063 | 185.0 | 740 | 0.1617 | 0.2622 | 0.5244 | 0.5244 | nan | 0.5244 | 0.0 | 0.5244 |
| 0.0095 | 190.0 | 760 | 0.0526 | 0.0804 | 0.1608 | 0.1608 | nan | 0.1608 | 0.0 | 0.1608 |
| 0.0119 | 195.0 | 780 | 0.1438 | 0.2352 | 0.4703 | 0.4703 | nan | 0.4703 | 0.0 | 0.4703 |
| 0.0264 | 200.0 | 800 | 0.1563 | 0.3272 | 0.6545 | 0.6545 | nan | 0.6545 | 0.0 | 0.6545 |
| 0.0043 | 205.0 | 820 | 0.0605 | 0.2524 | 0.5048 | 0.5048 | nan | 0.5048 | 0.0 | 0.5048 |
| 0.0055 | 210.0 | 840 | 0.1585 | 0.3650 | 0.7299 | 0.7299 | nan | 0.7299 | 0.0 | 0.7299 |
| 0.009 | 215.0 | 860 | 0.1218 | 0.2845 | 0.5690 | 0.5690 | nan | 0.5690 | 0.0 | 0.5690 |
| 0.0295 | 220.0 | 880 | 0.5198 | 0.4981 | 0.9963 | 0.9963 | nan | 0.9963 | 0.0 | 0.9963 |
| 0.004 | 225.0 | 900 | 0.1908 | 0.2370 | 0.4741 | 0.4741 | nan | 0.4741 | 0.0 | 0.4741 |
| 0.0108 | 230.0 | 920 | 0.0077 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0034 | 235.0 | 940 | 0.1812 | 0.1838 | 0.3676 | 0.3676 | nan | 0.3676 | 0.0 | 0.3676 |
| 0.0058 | 240.0 | 960 | 0.1195 | 0.1750 | 0.3501 | 0.3501 | nan | 0.3501 | 0.0 | 0.3501 |
| 0.0076 | 245.0 | 980 | 0.1365 | 0.0957 | 0.1914 | 0.1914 | nan | 0.1914 | 0.0 | 0.1914 |
| 0.0056 | 250.0 | 1000 | 0.0098 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0088 | 255.0 | 1020 | 0.0514 | 0.1783 | 0.3567 | 0.3567 | nan | 0.3567 | 0.0 | 0.3567 |
| 0.0143 | 260.0 | 1040 | 0.0106 | 0.0002 | 0.0004 | 0.0004 | nan | 0.0004 | 0.0 | 0.0004 |
| 0.0052 | 265.0 | 1060 | 0.0749 | 0.0476 | 0.0952 | 0.0952 | nan | 0.0952 | 0.0 | 0.0952 |
| 0.0084 | 270.0 | 1080 | 0.1056 | 0.2186 | 0.4372 | 0.4372 | nan | 0.4372 | 0.0 | 0.4372 |
| 0.0084 | 275.0 | 1100 | 0.1106 | 0.0734 | 0.1469 | 0.1469 | nan | 0.1469 | 0.0 | 0.1469 |
| 0.0043 | 280.0 | 1120 | 0.1078 | 0.3212 | 0.6425 | 0.6425 | nan | 0.6425 | 0.0 | 0.6425 |
| 0.0079 | 285.0 | 1140 | 0.0610 | 0.2426 | 0.4851 | 0.4851 | nan | 0.4851 | 0.0 | 0.4851 |
| 0.0037 | 290.0 | 1160 | 0.0624 | 0.2492 | 0.4985 | 0.4985 | nan | 0.4985 | 0.0 | 0.4985 |
| 0.0084 | 295.0 | 1180 | 0.1833 | 0.2811 | 0.5622 | 0.5622 | nan | 0.5622 | 0.0 | 0.5622 |
| 0.0049 | 300.0 | 1200 | 0.0843 | 0.2817 | 0.5634 | 0.5634 | nan | 0.5634 | 0.0 | 0.5634 |
| 0.0064 | 305.0 | 1220 | 0.0625 | 0.1790 | 0.3581 | 0.3581 | nan | 0.3581 | 0.0 | 0.3581 |
| 0.011 | 310.0 | 1240 | 0.0653 | 0.1121 | 0.2241 | 0.2241 | nan | 0.2241 | 0.0 | 0.2241 |
| 0.0041 | 315.0 | 1260 | 0.0620 | 0.198 | 0.396 | 0.396 | nan | 0.396 | 0.0 | 0.396 |
| 0.0079 | 320.0 | 1280 | 0.0733 | 0.11 | 0.22 | 0.22 | nan | 0.22 | 0.0 | 0.22 |
| 0.0039 | 325.0 | 1300 | 0.2672 | 0.2041 | 0.4081 | 0.4081 | nan | 0.4081 | 0.0 | 0.4081 |
| 0.0081 | 330.0 | 1320 | 0.0231 | 0.0708 | 0.1415 | 0.1415 | nan | 0.1415 | 0.0 | 0.1415 |
| 0.0038 | 335.0 | 1340 | 0.0244 | 0.1270 | 0.2541 | 0.2541 | nan | 0.2541 | 0.0 | 0.2541 |
| 0.0078 | 340.0 | 1360 | 0.0095 | 0.0198 | 0.0396 | 0.0396 | nan | 0.0396 | 0.0 | 0.0396 |
| 0.0061 | 345.0 | 1380 | 0.1181 | 0.1386 | 0.2772 | 0.2772 | nan | 0.2772 | 0.0 | 0.2772 |
| 0.0099 | 350.0 | 1400 | 0.0350 | 0.0970 | 0.1939 | 0.1939 | nan | 0.1939 | 0.0 | 0.1939 |
| 0.0066 | 355.0 | 1420 | 0.0303 | 0.1477 | 0.2954 | 0.2954 | nan | 0.2954 | 0.0 | 0.2954 |
| 0.0083 | 360.0 | 1440 | 0.0919 | 0.1237 | 0.2474 | 0.2474 | nan | 0.2474 | 0.0 | 0.2474 |
| 0.0082 | 365.0 | 1460 | 0.0087 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.0078 | 370.0 | 1480 | 0.1443 | 0.3666 | 0.7331 | 0.7331 | nan | 0.7331 | 0.0 | 0.7331 |
| 0.0075 | 375.0 | 1500 | 0.0493 | 0.038 | 0.076 | 0.076 | nan | 0.076 | 0.0 | 0.076 |
| 0.0058 | 380.0 | 1520 | 0.0941 | 0.2978 | 0.5956 | 0.5956 | nan | 0.5956 | 0.0 | 0.5956 |
| 0.0045 | 385.0 | 1540 | 0.0353 | 0.1239 | 0.2479 | 0.2479 | nan | 0.2479 | 0.0 | 0.2479 |
| 0.0077 | 390.0 | 1560 | 0.0469 | 0.1066 | 0.2131 | 0.2131 | nan | 0.2131 | 0.0 | 0.2131 |
| 0.0057 | 395.0 | 1580 | 0.0533 | 0.1192 | 0.2383 | 0.2383 | nan | 0.2383 | 0.0 | 0.2383 |
| 0.0073 | 400.0 | 1600 | 0.1126 | 0.1331 | 0.2662 | 0.2662 | nan | 0.2662 | 0.0 | 0.2662 |
| 0.0042 | 405.0 | 1620 | 0.1222 | 0.2205 | 0.4410 | 0.4410 | nan | 0.4410 | 0.0 | 0.4410 |
| 0.0078 | 410.0 | 1640 | 0.1111 | 0.2519 | 0.5039 | 0.5039 | nan | 0.5039 | 0.0 | 0.5039 |
| 0.0056 | 415.0 | 1660 | 0.0988 | 0.2311 | 0.4622 | 0.4622 | nan | 0.4622 | 0.0 | 0.4622 |
| 0.0093 | 420.0 | 1680 | 0.0464 | 0.0693 | 0.1386 | 0.1386 | nan | 0.1386 | 0.0 | 0.1386 |
| 0.0072 | 425.0 | 1700 | 0.0751 | 0.228 | 0.456 | 0.456 | nan | 0.456 | 0.0 | 0.456 |
| 0.0026 | 430.0 | 1720 | 0.0728 | 0.2397 | 0.4793 | 0.4793 | nan | 0.4793 | 0.0 | 0.4793 |
| 0.0051 | 435.0 | 1740 | 0.0443 | 0.0879 | 0.1757 | 0.1757 | nan | 0.1757 | 0.0 | 0.1757 |
| 0.0092 | 440.0 | 1760 | 0.0694 | 0.1514 | 0.3029 | 0.3029 | nan | 0.3029 | 0.0 | 0.3029 |
| 0.0079 | 445.0 | 1780 | 0.0287 | 0.1381 | 0.2761 | 0.2761 | nan | 0.2761 | 0.0 | 0.2761 |
| 0.0048 | 450.0 | 1800 | 0.0597 | 0.2146 | 0.4292 | 0.4292 | nan | 0.4292 | 0.0 | 0.4292 |
| 0.007 | 455.0 | 1820 | 0.1145 | 0.232 | 0.464 | 0.464 | nan | 0.464 | 0.0 | 0.464 |
| 0.0062 | 460.0 | 1840 | 0.0854 | 0.2715 | 0.5430 | 0.5430 | nan | 0.5430 | 0.0 | 0.5430 |
| 0.0031 | 465.0 | 1860 | 0.1299 | 0.2943 | 0.5886 | 0.5886 | nan | 0.5886 | 0.0 | 0.5886 |
| 0.0056 | 470.0 | 1880 | 0.0560 | 0.1650 | 0.3299 | 0.3299 | nan | 0.3299 | 0.0 | 0.3299 |
| 0.0031 | 475.0 | 1900 | 0.0706 | 0.1521 | 0.3043 | 0.3043 | nan | 0.3043 | 0.0 | 0.3043 |
| 0.0024 | 480.0 | 1920 | 0.0478 | 0.1862 | 0.3724 | 0.3724 | nan | 0.3724 | 0.0 | 0.3724 |
| 0.007 | 485.0 | 1940 | 0.0357 | 0.1796 | 0.3592 | 0.3592 | nan | 0.3592 | 0.0 | 0.3592 |
| 0.0075 | 490.0 | 1960 | 0.0598 | 0.1544 | 0.3088 | 0.3088 | nan | 0.3088 | 0.0 | 0.3088 |
| 0.0031 | 495.0 | 1980 | 0.0447 | 0.3236 | 0.6472 | 0.6472 | nan | 0.6472 | 0.0 | 0.6472 |
| 0.007 | 500.0 | 2000 | 0.1198 | 0.3270 | 0.6541 | 0.6541 | nan | 0.6541 | 0.0 | 0.6541 |
| 0.009 | 505.0 | 2020 | 0.0550 | 0.2481 | 0.4963 | 0.4963 | nan | 0.4963 | 0.0 | 0.4963 |
| 0.0042 | 510.0 | 2040 | 0.0649 | 0.2150 | 0.4299 | 0.4299 | nan | 0.4299 | 0.0 | 0.4299 |
| 0.0024 | 515.0 | 2060 | 0.0731 | 0.2039 | 0.4079 | 0.4079 | nan | 0.4079 | 0.0 | 0.4079 |
| 0.0034 | 520.0 | 2080 | 0.0812 | 0.2652 | 0.5303 | 0.5303 | nan | 0.5303 | 0.0 | 0.5303 |
| 0.0039 | 525.0 | 2100 | 0.0719 | 0.1894 | 0.3788 | 0.3788 | nan | 0.3788 | 0.0 | 0.3788 |
| 0.0061 | 530.0 | 2120 | 0.0943 | 0.2135 | 0.4270 | 0.4270 | nan | 0.4270 | 0.0 | 0.4270 |
| 0.0054 | 535.0 | 2140 | 0.0578 | 0.2337 | 0.4674 | 0.4674 | nan | 0.4674 | 0.0 | 0.4674 |
| 0.0074 | 540.0 | 2160 | 0.1498 | 0.4530 | 0.9059 | 0.9059 | nan | 0.9059 | 0.0 | 0.9059 |
| 0.0052 | 545.0 | 2180 | 0.1614 | 0.4556 | 0.9112 | 0.9112 | nan | 0.9112 | 0.0 | 0.9112 |
| 0.0068 | 550.0 | 2200 | 0.0747 | 0.3214 | 0.6428 | 0.6428 | nan | 0.6428 | 0.0 | 0.6428 |
| 0.0068 | 555.0 | 2220 | 0.0462 | 0.1623 | 0.3247 | 0.3247 | nan | 0.3247 | 0.0 | 0.3247 |
| 0.0097 | 560.0 | 2240 | 0.0748 | 0.3294 | 0.6588 | 0.6588 | nan | 0.6588 | 0.0 | 0.6588 |
| 0.0066 | 565.0 | 2260 | 0.0913 | 0.2101 | 0.4203 | 0.4203 | nan | 0.4203 | 0.0 | 0.4203 |
| 0.0054 | 570.0 | 2280 | 0.0472 | 0.1623 | 0.3247 | 0.3247 | nan | 0.3247 | 0.0 | 0.3247 |
| 0.0039 | 575.0 | 2300 | 0.0361 | 0.0861 | 0.1721 | 0.1721 | nan | 0.1721 | 0.0 | 0.1721 |
| 0.0023 | 580.0 | 2320 | 0.0547 | 0.1886 | 0.3772 | 0.3772 | nan | 0.3772 | 0.0 | 0.3772 |
| 0.0056 | 585.0 | 2340 | 0.0668 | 0.2911 | 0.5822 | 0.5822 | nan | 0.5822 | 0.0 | 0.5822 |
| 0.0055 | 590.0 | 2360 | 0.0511 | 0.2152 | 0.4303 | 0.4303 | nan | 0.4303 | 0.0 | 0.4303 |
| 0.0027 | 595.0 | 2380 | 0.0570 | 0.2193 | 0.4386 | 0.4386 | nan | 0.4386 | 0.0 | 0.4386 |
| 0.0046 | 600.0 | 2400 | 0.0373 | 0.0786 | 0.1572 | 0.1572 | nan | 0.1572 | 0.0 | 0.1572 |
| 0.0035 | 605.0 | 2420 | 0.0468 | 0.0870 | 0.1741 | 0.1741 | nan | 0.1741 | 0.0 | 0.1741 |
| 0.0022 | 610.0 | 2440 | 0.0476 | 0.1029 | 0.2058 | 0.2058 | nan | 0.2058 | 0.0 | 0.2058 |
| 0.0058 | 615.0 | 2460 | 0.0418 | 0.1003 | 0.2007 | 0.2007 | nan | 0.2007 | 0.0 | 0.2007 |
| 0.003 | 620.0 | 2480 | 0.0520 | 0.2134 | 0.4269 | 0.4269 | nan | 0.4269 | 0.0 | 0.4269 |
| 0.0072 | 625.0 | 2500 | 0.0475 | 0.198 | 0.396 | 0.396 | nan | 0.396 | 0.0 | 0.396 |
| 0.0054 | 630.0 | 2520 | 0.0371 | 0.1370 | 0.2741 | 0.2741 | nan | 0.2741 | 0.0 | 0.2741 |
| 0.0033 | 635.0 | 2540 | 0.0635 | 0.132 | 0.264 | 0.264 | nan | 0.264 | 0.0 | 0.264 |
| 0.0076 | 640.0 | 2560 | 0.0540 | 0.1278 | 0.2556 | 0.2556 | nan | 0.2556 | 0.0 | 0.2556 |
| 0.0067 | 645.0 | 2580 | 0.0339 | 0.128 | 0.256 | 0.256 | nan | 0.256 | 0.0 | 0.256 |
| 0.0032 | 650.0 | 2600 | 0.0399 | 0.1341 | 0.2683 | 0.2683 | nan | 0.2683 | 0.0 | 0.2683 |
| 0.0024 | 655.0 | 2620 | 0.0467 | 0.108 | 0.216 | 0.216 | nan | 0.216 | 0.0 | 0.216 |
| 0.0032 | 660.0 | 2640 | 0.0472 | 0.1653 | 0.3306 | 0.3306 | nan | 0.3306 | 0.0 | 0.3306 |
| 0.006 | 665.0 | 2660 | 0.0540 | 0.1963 | 0.3927 | 0.3927 | nan | 0.3927 | 0.0 | 0.3927 |
| 0.0021 | 670.0 | 2680 | 0.0532 | 0.1886 | 0.3771 | 0.3771 | nan | 0.3771 | 0.0 | 0.3771 |
| 0.0037 | 675.0 | 2700 | 0.0661 | 0.2588 | 0.5177 | 0.5177 | nan | 0.5177 | 0.0 | 0.5177 |
| 0.0073 | 680.0 | 2720 | 0.0514 | 0.1327 | 0.2654 | 0.2654 | nan | 0.2654 | 0.0 | 0.2654 |
| 0.0038 | 685.0 | 2740 | 0.0596 | 0.2090 | 0.4179 | 0.4179 | nan | 0.4179 | 0.0 | 0.4179 |
| 0.0038 | 690.0 | 2760 | 0.0421 | 0.1279 | 0.2557 | 0.2557 | nan | 0.2557 | 0.0 | 0.2557 |
| 0.0036 | 695.0 | 2780 | 0.0626 | 0.1976 | 0.3952 | 0.3952 | nan | 0.3952 | 0.0 | 0.3952 |
| 0.003 | 700.0 | 2800 | 0.0525 | 0.212 | 0.424 | 0.424 | nan | 0.424 | 0.0 | 0.424 |
| 0.0046 | 705.0 | 2820 | 0.0515 | 0.274 | 0.548 | 0.548 | nan | 0.548 | 0.0 | 0.548 |
| 0.004 | 710.0 | 2840 | 0.0691 | 0.2222 | 0.4444 | 0.4444 | nan | 0.4444 | 0.0 | 0.4444 |
| 0.0029 | 715.0 | 2860 | 0.0495 | 0.1559 | 0.3117 | 0.3117 | nan | 0.3117 | 0.0 | 0.3117 |
| 0.0057 | 720.0 | 2880 | 0.0442 | 0.1541 | 0.3081 | 0.3081 | nan | 0.3081 | 0.0 | 0.3081 |
| 0.007 | 725.0 | 2900 | 0.0835 | 0.1562 | 0.3124 | 0.3124 | nan | 0.3124 | 0.0 | 0.3124 |
| 0.0023 | 730.0 | 2920 | 0.0686 | 0.1466 | 0.2931 | 0.2931 | nan | 0.2931 | 0.0 | 0.2931 |
| 0.0054 | 735.0 | 2940 | 0.0813 | 0.1685 | 0.3370 | 0.3370 | nan | 0.3370 | 0.0 | 0.3370 |
| 0.0035 | 740.0 | 2960 | 0.0679 | 0.1623 | 0.3247 | 0.3247 | nan | 0.3247 | 0.0 | 0.3247 |
| 0.0035 | 745.0 | 2980 | 0.0535 | 0.1502 | 0.3004 | 0.3004 | nan | 0.3004 | 0.0 | 0.3004 |
| 0.003 | 750.0 | 3000 | 0.0645 | 0.1266 | 0.2531 | 0.2531 | nan | 0.2531 | 0.0 | 0.2531 |
| 0.0059 | 755.0 | 3020 | 0.0617 | 0.1535 | 0.3070 | 0.3070 | nan | 0.3070 | 0.0 | 0.3070 |
| 0.004 | 760.0 | 3040 | 0.0698 | 0.1315 | 0.2630 | 0.2630 | nan | 0.2630 | 0.0 | 0.2630 |
| 0.0032 | 765.0 | 3060 | 0.0515 | 0.1148 | 0.2297 | 0.2297 | nan | 0.2297 | 0.0 | 0.2297 |
| 0.0031 | 770.0 | 3080 | 0.0574 | 0.1309 | 0.2618 | 0.2618 | nan | 0.2618 | 0.0 | 0.2618 |
| 0.0039 | 775.0 | 3100 | 0.0480 | 0.1624 | 0.3248 | 0.3248 | nan | 0.3248 | 0.0 | 0.3248 |
| 0.0064 | 780.0 | 3120 | 0.0490 | 0.1434 | 0.2869 | 0.2869 | nan | 0.2869 | 0.0 | 0.2869 |
| 0.0056 | 785.0 | 3140 | 0.0492 | 0.1514 | 0.3028 | 0.3028 | nan | 0.3028 | 0.0 | 0.3028 |
| 0.0018 | 790.0 | 3160 | 0.0703 | 0.1704 | 0.3408 | 0.3408 | nan | 0.3408 | 0.0 | 0.3408 |
| 0.0015 | 795.0 | 3180 | 0.0819 | 0.2045 | 0.4090 | 0.4090 | nan | 0.4090 | 0.0 | 0.4090 |
| 0.0042 | 800.0 | 3200 | 0.0774 | 0.2131 | 0.4262 | 0.4262 | nan | 0.4262 | 0.0 | 0.4262 |
| 0.0037 | 805.0 | 3220 | 0.0510 | 0.1390 | 0.2779 | 0.2779 | nan | 0.2779 | 0.0 | 0.2779 |
| 0.0032 | 810.0 | 3240 | 0.0628 | 0.1786 | 0.3571 | 0.3571 | nan | 0.3571 | 0.0 | 0.3571 |
| 0.0026 | 815.0 | 3260 | 0.0778 | 0.1746 | 0.3492 | 0.3492 | nan | 0.3492 | 0.0 | 0.3492 |
| 0.0022 | 820.0 | 3280 | 0.0600 | 0.1619 | 0.3237 | 0.3237 | nan | 0.3237 | 0.0 | 0.3237 |
| 0.0042 | 825.0 | 3300 | 0.0691 | 0.1792 | 0.3583 | 0.3583 | nan | 0.3583 | 0.0 | 0.3583 |
| 0.0032 | 830.0 | 3320 | 0.0831 | 0.2368 | 0.4737 | 0.4737 | nan | 0.4737 | 0.0 | 0.4737 |
| 0.0041 | 835.0 | 3340 | 0.0627 | 0.1748 | 0.3495 | 0.3495 | nan | 0.3495 | 0.0 | 0.3495 |
| 0.0024 | 840.0 | 3360 | 0.0598 | 0.1503 | 0.3007 | 0.3007 | nan | 0.3007 | 0.0 | 0.3007 |
| 0.0017 | 845.0 | 3380 | 0.0719 | 0.1588 | 0.3175 | 0.3175 | nan | 0.3175 | 0.0 | 0.3175 |
| 0.0044 | 850.0 | 3400 | 0.0526 | 0.1526 | 0.3051 | 0.3051 | nan | 0.3051 | 0.0 | 0.3051 |
| 0.0059 | 855.0 | 3420 | 0.0610 | 0.17 | 0.34 | 0.34 | nan | 0.34 | 0.0 | 0.34 |
| 0.0046 | 860.0 | 3440 | 0.0628 | 0.1697 | 0.3394 | 0.3394 | nan | 0.3394 | 0.0 | 0.3394 |
| 0.0058 | 865.0 | 3460 | 0.0507 | 0.1936 | 0.3872 | 0.3872 | nan | 0.3872 | 0.0 | 0.3872 |
| 0.0049 | 870.0 | 3480 | 0.0482 | 0.1659 | 0.3319 | 0.3319 | nan | 0.3319 | 0.0 | 0.3319 |
| 0.003 | 875.0 | 3500 | 0.0692 | 0.2345 | 0.4690 | 0.4690 | nan | 0.4690 | 0.0 | 0.4690 |
| 0.0048 | 880.0 | 3520 | 0.0525 | 0.1594 | 0.3189 | 0.3189 | nan | 0.3189 | 0.0 | 0.3189 |
| 0.003 | 885.0 | 3540 | 0.0564 | 0.2008 | 0.4015 | 0.4015 | nan | 0.4015 | 0.0 | 0.4015 |
| 0.004 | 890.0 | 3560 | 0.0428 | 0.1561 | 0.3123 | 0.3123 | nan | 0.3123 | 0.0 | 0.3123 |
| 0.0018 | 895.0 | 3580 | 0.0454 | 0.144 | 0.288 | 0.288 | nan | 0.288 | 0.0 | 0.288 |
| 0.0023 | 900.0 | 3600 | 0.0460 | 0.1642 | 0.3284 | 0.3284 | nan | 0.3284 | 0.0 | 0.3284 |
| 0.0043 | 905.0 | 3620 | 0.0513 | 0.2052 | 0.4103 | 0.4103 | nan | 0.4103 | 0.0 | 0.4103 |
| 0.0013 | 910.0 | 3640 | 0.0502 | 0.1739 | 0.3479 | 0.3479 | nan | 0.3479 | 0.0 | 0.3479 |
| 0.0057 | 915.0 | 3660 | 0.0440 | 0.1687 | 0.3374 | 0.3374 | nan | 0.3374 | 0.0 | 0.3374 |
| 0.006 | 920.0 | 3680 | 0.0483 | 0.1517 | 0.3034 | 0.3034 | nan | 0.3034 | 0.0 | 0.3034 |
| 0.0029 | 925.0 | 3700 | 0.0443 | 0.1550 | 0.3101 | 0.3101 | nan | 0.3101 | 0.0 | 0.3101 |
| 0.0044 | 930.0 | 3720 | 0.0506 | 0.1661 | 0.3323 | 0.3323 | nan | 0.3323 | 0.0 | 0.3323 |
| 0.0016 | 935.0 | 3740 | 0.0612 | 0.1957 | 0.3914 | 0.3914 | nan | 0.3914 | 0.0 | 0.3914 |
| 0.0026 | 940.0 | 3760 | 0.0525 | 0.1766 | 0.3531 | 0.3531 | nan | 0.3531 | 0.0 | 0.3531 |
| 0.005 | 945.0 | 3780 | 0.0551 | 0.2171 | 0.4342 | 0.4342 | nan | 0.4342 | 0.0 | 0.4342 |
| 0.0035 | 950.0 | 3800 | 0.0554 | 0.1841 | 0.3683 | 0.3683 | nan | 0.3683 | 0.0 | 0.3683 |
| 0.003 | 955.0 | 3820 | 0.0271 | 0.1519 | 0.3037 | 0.3037 | nan | 0.3037 | 0.0 | 0.3037 |
| 0.0054 | 960.0 | 3840 | 0.0493 | 0.1688 | 0.3375 | 0.3375 | nan | 0.3375 | 0.0 | 0.3375 |
| 0.0031 | 965.0 | 3860 | 0.0518 | 0.1751 | 0.3502 | 0.3502 | nan | 0.3502 | 0.0 | 0.3502 |
| 0.005 | 970.0 | 3880 | 0.0569 | 0.1903 | 0.3807 | 0.3807 | nan | 0.3807 | 0.0 | 0.3807 |
| 0.0023 | 975.0 | 3900 | 0.0498 | 0.1952 | 0.3903 | 0.3903 | nan | 0.3903 | 0.0 | 0.3903 |
| 0.0023 | 980.0 | 3920 | 0.0581 | 0.2254 | 0.4508 | 0.4508 | nan | 0.4508 | 0.0 | 0.4508 |
| 0.0014 | 985.0 | 3940 | 0.0361 | 0.1692 | 0.3383 | 0.3383 | nan | 0.3383 | 0.0 | 0.3383 |
| 0.0028 | 990.0 | 3960 | 0.0524 | 0.2134 | 0.4268 | 0.4268 | nan | 0.4268 | 0.0 | 0.4268 |
| 0.0039 | 995.0 | 3980 | 0.0586 | 0.2223 | 0.4446 | 0.4446 | nan | 0.4446 | 0.0 | 0.4446 |
| 0.0055 | 1000.0 | 4000 | 0.0579 | 0.2161 | 0.4321 | 0.4321 | nan | 0.4321 | 0.0 | 0.4321 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
| [
"unlabeled",
"lipid"
] |
jhaberbe/segformer-b0-finetuned-lipid-droplets-v5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-lipid-droplets-v5
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jhaberbe/lipid-droplets-v5 dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.3532
- eval_mean_iou: 0.0
- eval_mean_accuracy: 0.0
- eval_overall_accuracy: 0.0
- eval_accuracy_unlabeled: nan
- eval_accuracy_lipid: 0.0
- eval_accuracy_labeled_negative: 0.0
- eval_iou_unlabeled: 0.0
- eval_iou_lipid: 0.0
- eval_iou_labeled_negative: 0.0
- eval_runtime: 0.9807
- eval_samples_per_second: 8.158
- eval_steps_per_second: 1.02
- epoch: 440.0
- step: 440
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1000
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1
| [
"unlabeled",
"lipid",
"labeled_negative"
] |
Dnq2025/mask2former-finetuned-ER-Mito-LD |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mask2former-finetuned-ER-Mito-LD
This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
It achieves the following results on the evaluation set:
- Loss: 34.0481
- Dummy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
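A minimal inference sketch (not from the original card; the image path is a placeholder, and the standard `transformers` Mask2Former API with a processor config on this repo is assumed):
```python
import torch
from PIL import Image
from transformers import Mask2FormerImageProcessor, Mask2FormerForUniversalSegmentation

processor = Mask2FormerImageProcessor.from_pretrained("Dnq2025/mask2former-finetuned-ER-Mito-LD")
model = Mask2FormerForUniversalSegmentation.from_pretrained("Dnq2025/mask2former-finetuned-ER-Mito-LD")

image = Image.open("example.png").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
# Collapse the per-query mask predictions into a single semantic map
# over the four classes (background, er, mito, ld).
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
```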
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 1337
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 12900
### Training results
| Training Loss | Epoch | Step | Validation Loss | Dummy |
|:-------------:|:-----:|:-----:|:---------------:|:-----:|
| No log | 1.0 | 86 | 34.1100 | 1.0 |
| 43.7572 | 2.0 | 172 | 30.1643 | 1.0 |
| 32.0025 | 3.0 | 258 | 28.1818 | 1.0 |
| 26.8817 | 4.0 | 344 | 27.1780 | 1.0 |
| 24.1857 | 5.0 | 430 | 26.5827 | 1.0 |
| 22.9335 | 6.0 | 516 | 25.7310 | 1.0 |
| 21.3521 | 7.0 | 602 | 25.2442 | 1.0 |
| 21.3521 | 8.0 | 688 | 25.0511 | 1.0 |
| 20.4144 | 9.0 | 774 | 25.3838 | 1.0 |
| 18.8722 | 10.0 | 860 | 25.7261 | 1.0 |
| 18.576 | 11.0 | 946 | 25.0048 | 1.0 |
| 18.1119 | 12.0 | 1032 | 25.3630 | 1.0 |
| 17.8769 | 13.0 | 1118 | 25.2566 | 1.0 |
| 17.0204 | 14.0 | 1204 | 25.6023 | 1.0 |
| 17.0204 | 15.0 | 1290 | 26.3285 | 1.0 |
| 16.3528 | 16.0 | 1376 | 26.3254 | 1.0 |
| 16.5548 | 17.0 | 1462 | 26.9244 | 1.0 |
| 16.6848 | 18.0 | 1548 | 27.6294 | 1.0 |
| 15.4544 | 19.0 | 1634 | 25.7570 | 1.0 |
| 15.7209 | 20.0 | 1720 | 25.7097 | 1.0 |
| 15.3127 | 21.0 | 1806 | 27.2604 | 1.0 |
| 15.3127 | 22.0 | 1892 | 26.4286 | 1.0 |
| 14.9528 | 23.0 | 1978 | 27.5768 | 1.0 |
| 15.1795 | 24.0 | 2064 | 26.4714 | 1.0 |
| 14.707 | 25.0 | 2150 | 28.0977 | 1.0 |
| 14.3456 | 26.0 | 2236 | 26.7914 | 1.0 |
| 14.4534 | 27.0 | 2322 | 27.4079 | 1.0 |
| 14.4448 | 28.0 | 2408 | 26.8291 | 1.0 |
| 14.4448 | 29.0 | 2494 | 27.1506 | 1.0 |
| 14.0327 | 30.0 | 2580 | 27.1973 | 1.0 |
| 13.8785 | 31.0 | 2666 | 27.5062 | 1.0 |
| 14.3373 | 32.0 | 2752 | 27.9510 | 1.0 |
| 13.3176 | 33.0 | 2838 | 27.1878 | 1.0 |
| 13.8154 | 34.0 | 2924 | 25.5759 | 1.0 |
| 13.8962 | 35.0 | 3010 | 27.7627 | 1.0 |
| 13.8962 | 36.0 | 3096 | 28.8061 | 1.0 |
| 13.3858 | 37.0 | 3182 | 28.4328 | 1.0 |
| 12.9659 | 38.0 | 3268 | 27.5515 | 1.0 |
| 13.6813 | 39.0 | 3354 | 27.8206 | 1.0 |
| 13.3049 | 40.0 | 3440 | 28.6062 | 1.0 |
| 13.1584 | 41.0 | 3526 | 28.5364 | 1.0 |
| 12.9234 | 42.0 | 3612 | 29.3165 | 1.0 |
| 12.9234 | 43.0 | 3698 | 28.5156 | 1.0 |
| 13.1375 | 44.0 | 3784 | 28.2476 | 1.0 |
| 12.7875 | 45.0 | 3870 | 29.9959 | 1.0 |
| 12.6507 | 46.0 | 3956 | 28.5480 | 1.0 |
| 13.0131 | 47.0 | 4042 | 29.1117 | 1.0 |
| 12.3806 | 48.0 | 4128 | 31.2153 | 1.0 |
| 12.9016 | 49.0 | 4214 | 28.9405 | 1.0 |
| 12.274 | 50.0 | 4300 | 28.7396 | 1.0 |
| 12.274 | 51.0 | 4386 | 30.3948 | 1.0 |
| 12.5767 | 52.0 | 4472 | 29.3863 | 1.0 |
| 12.5965 | 53.0 | 4558 | 29.4516 | 1.0 |
| 11.9685 | 54.0 | 4644 | 27.2974 | 1.0 |
| 12.3025 | 55.0 | 4730 | 27.0013 | 1.0 |
| 12.4256 | 56.0 | 4816 | 27.2713 | 1.0 |
| 12.2008 | 57.0 | 4902 | 27.4054 | 1.0 |
| 12.2008 | 58.0 | 4988 | 27.9546 | 1.0 |
| 12.1018 | 59.0 | 5074 | 28.9453 | 1.0 |
| 12.2156 | 60.0 | 5160 | 29.3121 | 1.0 |
| 11.9526 | 61.0 | 5246 | 30.1903 | 1.0 |
| 12.1103 | 62.0 | 5332 | 28.8276 | 1.0 |
| 11.8017 | 63.0 | 5418 | 28.7898 | 1.0 |
| 11.9907 | 64.0 | 5504 | 28.6167 | 1.0 |
| 11.9907 | 65.0 | 5590 | 29.2822 | 1.0 |
| 11.6683 | 66.0 | 5676 | 31.4695 | 1.0 |
| 12.1544 | 67.0 | 5762 | 27.7773 | 1.0 |
| 11.7442 | 68.0 | 5848 | 29.5376 | 1.0 |
| 11.1493 | 69.0 | 5934 | 27.8916 | 1.0 |
| 12.0781 | 70.0 | 6020 | 28.4096 | 1.0 |
| 11.8055 | 71.0 | 6106 | 29.2272 | 1.0 |
| 11.8055 | 72.0 | 6192 | 29.2769 | 1.0 |
| 11.4811 | 73.0 | 6278 | 29.2552 | 1.0 |
| 11.5947 | 74.0 | 6364 | 29.2611 | 1.0 |
| 11.7263 | 75.0 | 6450 | 30.7953 | 1.0 |
| 11.7399 | 76.0 | 6536 | 30.0692 | 1.0 |
| 11.0851 | 77.0 | 6622 | 29.6803 | 1.0 |
| 11.5118 | 78.0 | 6708 | 30.7345 | 1.0 |
| 11.5118 | 79.0 | 6794 | 31.5980 | 1.0 |
| 11.516 | 80.0 | 6880 | 30.5279 | 1.0 |
| 11.3797 | 81.0 | 6966 | 30.2265 | 1.0 |
| 11.3335 | 82.0 | 7052 | 30.3816 | 1.0 |
| 11.2303 | 83.0 | 7138 | 29.3238 | 1.0 |
| 11.1964 | 84.0 | 7224 | 30.3987 | 1.0 |
| 11.321 | 85.0 | 7310 | 30.1935 | 1.0 |
| 11.321 | 86.0 | 7396 | 29.1421 | 1.0 |
| 11.3891 | 87.0 | 7482 | 31.2074 | 1.0 |
| 11.1347 | 88.0 | 7568 | 30.6735 | 1.0 |
| 11.1945 | 89.0 | 7654 | 31.2053 | 1.0 |
| 10.9891 | 90.0 | 7740 | 31.4373 | 1.0 |
| 11.104 | 91.0 | 7826 | 31.3946 | 1.0 |
| 11.1408 | 92.0 | 7912 | 31.2186 | 1.0 |
| 11.1408 | 93.0 | 7998 | 29.5871 | 1.0 |
| 11.0779 | 94.0 | 8084 | 30.4671 | 1.0 |
| 11.0551 | 95.0 | 8170 | 32.0130 | 1.0 |
| 10.8809 | 96.0 | 8256 | 30.4459 | 1.0 |
| 11.1123 | 97.0 | 8342 | 30.8415 | 1.0 |
| 10.7116 | 98.0 | 8428 | 31.0445 | 1.0 |
| 11.0086 | 99.0 | 8514 | 31.0471 | 1.0 |
| 11.0542 | 100.0 | 8600 | 31.0217 | 1.0 |
| 11.0542 | 101.0 | 8686 | 31.7885 | 1.0 |
| 10.8332 | 102.0 | 8772 | 30.6191 | 1.0 |
| 10.8696 | 103.0 | 8858 | 31.2075 | 1.0 |
| 10.6959 | 104.0 | 8944 | 32.0795 | 1.0 |
| 11.0688 | 105.0 | 9030 | 33.7820 | 1.0 |
| 10.6762 | 106.0 | 9116 | 31.9403 | 1.0 |
| 10.8607 | 107.0 | 9202 | 33.1345 | 1.0 |
| 10.8607 | 108.0 | 9288 | 31.0811 | 1.0 |
| 10.7504 | 109.0 | 9374 | 31.0663 | 1.0 |
| 10.7841 | 110.0 | 9460 | 30.0841 | 1.0 |
| 10.5677 | 111.0 | 9546 | 30.8185 | 1.0 |
| 11.0266 | 112.0 | 9632 | 32.1549 | 1.0 |
| 10.5912 | 113.0 | 9718 | 32.2208 | 1.0 |
| 10.6698 | 114.0 | 9804 | 31.5337 | 1.0 |
| 10.6698 | 115.0 | 9890 | 32.2273 | 1.0 |
| 10.6857 | 116.0 | 9976 | 31.8648 | 1.0 |
| 10.5977 | 117.0 | 10062 | 31.8058 | 1.0 |
| 10.6883 | 118.0 | 10148 | 31.7254 | 1.0 |
| 10.3506 | 119.0 | 10234 | 33.0298 | 1.0 |
| 10.9217 | 120.0 | 10320 | 33.3403 | 1.0 |
| 10.5332 | 121.0 | 10406 | 32.5384 | 1.0 |
| 10.5332 | 122.0 | 10492 | 32.2192 | 1.0 |
| 10.4658 | 123.0 | 10578 | 32.8913 | 1.0 |
| 10.4877 | 124.0 | 10664 | 33.1068 | 1.0 |
| 10.7404 | 125.0 | 10750 | 34.1187 | 1.0 |
| 10.2195 | 126.0 | 10836 | 32.4418 | 1.0 |
| 10.7622 | 127.0 | 10922 | 32.2935 | 1.0 |
| 10.4301 | 128.0 | 11008 | 33.2411 | 1.0 |
| 10.4301 | 129.0 | 11094 | 32.3692 | 1.0 |
| 10.6464 | 130.0 | 11180 | 32.6297 | 1.0 |
| 10.4213 | 131.0 | 11266 | 33.7513 | 1.0 |
| 10.382 | 132.0 | 11352 | 32.6382 | 1.0 |
| 10.6049 | 133.0 | 11438 | 33.2621 | 1.0 |
| 10.3039 | 134.0 | 11524 | 32.9468 | 1.0 |
| 10.3088 | 135.0 | 11610 | 33.4821 | 1.0 |
| 10.3088 | 136.0 | 11696 | 33.4824 | 1.0 |
| 10.4832 | 137.0 | 11782 | 32.9320 | 1.0 |
| 10.4149 | 138.0 | 11868 | 33.8853 | 1.0 |
| 10.2473 | 139.0 | 11954 | 33.5977 | 1.0 |
| 10.7137 | 140.0 | 12040 | 34.1817 | 1.0 |
| 10.2686 | 141.0 | 12126 | 34.0892 | 1.0 |
| 10.2581 | 142.0 | 12212 | 34.1113 | 1.0 |
| 10.2581 | 143.0 | 12298 | 33.9106 | 1.0 |
| 10.447 | 144.0 | 12384 | 33.3470 | 1.0 |
| 10.3823 | 145.0 | 12470 | 33.3055 | 1.0 |
| 10.1283 | 146.0 | 12556 | 33.6762 | 1.0 |
| 10.5364 | 147.0 | 12642 | 33.9977 | 1.0 |
| 10.1257 | 148.0 | 12728 | 34.0327 | 1.0 |
| 10.3092 | 149.0 | 12814 | 34.1170 | 1.0 |
| 10.4947 | 150.0 | 12900 | 33.8776 | 1.0 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"er",
"mito",
"ld"
] |
Dnq2025/mask2former-finetuned-ER-Mito-LD3 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mask2former-finetuned-ER-Mito-LD3
This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
It achieves the following results on the evaluation set:
- Loss: 39.9236
- Dummy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 12900
### Training results
| Training Loss | Epoch | Step | Validation Loss | Dummy |
|:-------------:|:-----:|:-----:|:---------------:|:-----:|
| 57.794 | 1.0 | 129 | 47.5030 | 1.0 |
| 45.635 | 2.0 | 258 | 42.4622 | 1.0 |
| 43.6742 | 3.0 | 387 | 40.1383 | 1.0 |
| 37.5286 | 4.0 | 516 | 41.1964 | 1.0 |
| 33.7618 | 5.0 | 645 | 34.4374 | 1.0 |
| 31.5899 | 6.0 | 774 | 39.8242 | 1.0 |
| 29.0727 | 7.0 | 903 | 33.3223 | 1.0 |
| 27.8483 | 8.0 | 1032 | 30.9625 | 1.0 |
| 26.0904 | 9.0 | 1161 | 31.7084 | 1.0 |
| 26.1043 | 10.0 | 1290 | 31.8088 | 1.0 |
| 24.3038 | 11.0 | 1419 | 30.3361 | 1.0 |
| 23.6493 | 12.0 | 1548 | 30.2030 | 1.0 |
| 23.9146 | 13.0 | 1677 | 31.0806 | 1.0 |
| 21.9133 | 14.0 | 1806 | 31.3974 | 1.0 |
| 22.3071 | 15.0 | 1935 | 32.0925 | 1.0 |
| 21.0819 | 16.0 | 2064 | 29.9367 | 1.0 |
| 21.0089 | 17.0 | 2193 | 30.0420 | 1.0 |
| 20.9169 | 18.0 | 2322 | 29.2938 | 1.0 |
| 19.7935 | 19.0 | 2451 | 31.3945 | 1.0 |
| 19.8749 | 20.0 | 2580 | 29.8457 | 1.0 |
| 19.2973 | 21.0 | 2709 | 29.0713 | 1.0 |
| 18.5436 | 22.0 | 2838 | 29.0846 | 1.0 |
| 18.5996 | 23.0 | 2967 | 29.8810 | 1.0 |
| 19.1228 | 24.0 | 3096 | 29.3016 | 1.0 |
| 18.0519 | 25.0 | 3225 | 30.7155 | 1.0 |
| 17.7073 | 26.0 | 3354 | 28.7168 | 1.0 |
| 17.5055 | 27.0 | 3483 | 28.9899 | 1.0 |
| 17.4854 | 28.0 | 3612 | 30.1944 | 1.0 |
| 17.0048 | 29.0 | 3741 | 29.2829 | 1.0 |
| 16.8731 | 30.0 | 3870 | 30.1208 | 1.0 |
| 16.683 | 31.0 | 3999 | 30.7583 | 1.0 |
| 16.6109 | 32.0 | 4128 | 30.6232 | 1.0 |
| 15.8261 | 33.0 | 4257 | 29.4162 | 1.0 |
| 16.9002 | 34.0 | 4386 | 30.4388 | 1.0 |
| 16.3081 | 35.0 | 4515 | 29.9756 | 1.0 |
| 15.4745 | 36.0 | 4644 | 28.8214 | 1.0 |
| 15.938 | 37.0 | 4773 | 29.1001 | 1.0 |
| 15.9947 | 38.0 | 4902 | 31.0533 | 1.0 |
| 15.2328 | 39.0 | 5031 | 31.6211 | 1.0 |
| 15.202 | 40.0 | 5160 | 33.1383 | 1.0 |
| 15.0583 | 41.0 | 5289 | 31.4089 | 1.0 |
| 14.573 | 42.0 | 5418 | 31.5681 | 1.0 |
| 14.7401 | 43.0 | 5547 | 30.5548 | 1.0 |
| 14.6052 | 44.0 | 5676 | 31.3953 | 1.0 |
| 14.1299 | 45.0 | 5805 | 30.8153 | 1.0 |
| 13.6851 | 46.0 | 5934 | 30.9693 | 1.0 |
| 14.6677 | 47.0 | 6063 | 31.9361 | 1.0 |
| 13.6493 | 48.0 | 6192 | 34.3328 | 1.0 |
| 14.166 | 49.0 | 6321 | 32.6231 | 1.0 |
| 13.7388 | 50.0 | 6450 | 33.1736 | 1.0 |
| 13.0849 | 51.0 | 6579 | 34.9522 | 1.0 |
| 13.2502 | 52.0 | 6708 | 35.7990 | 1.0 |
| 13.5116 | 53.0 | 6837 | 31.5737 | 1.0 |
| 12.6993 | 54.0 | 6966 | 33.2650 | 1.0 |
| 13.3602 | 55.0 | 7095 | 34.8914 | 1.0 |
| 12.9585 | 56.0 | 7224 | 35.9862 | 1.0 |
| 12.7434 | 57.0 | 7353 | 34.9106 | 1.0 |
| 12.7299 | 58.0 | 7482 | 34.0106 | 1.0 |
| 12.717 | 59.0 | 7611 | 36.3588 | 1.0 |
| 12.0563 | 60.0 | 7740 | 35.0923 | 1.0 |
| 13.012 | 61.0 | 7869 | 38.7323 | 1.0 |
| 12.2878 | 62.0 | 7998 | 34.9967 | 1.0 |
| 12.2794 | 63.0 | 8127 | 37.5577 | 1.0 |
| 12.4147 | 64.0 | 8256 | 37.2733 | 1.0 |
| 12.0032 | 65.0 | 8385 | 35.3015 | 1.0 |
| 12.2793 | 66.0 | 8514 | 35.2806 | 1.0 |
| 12.2309 | 67.0 | 8643 | 36.2488 | 1.0 |
| 11.7082 | 68.0 | 8772 | 35.6687 | 1.0 |
| 11.8694 | 69.0 | 8901 | 36.0470 | 1.0 |
| 11.782 | 70.0 | 9030 | 35.4055 | 1.0 |
| 11.6254 | 71.0 | 9159 | 36.7066 | 1.0 |
| 11.5873 | 72.0 | 9288 | 36.1084 | 1.0 |
| 11.6251 | 73.0 | 9417 | 38.2932 | 1.0 |
| 11.4589 | 74.0 | 9546 | 36.5570 | 1.0 |
| 11.7378 | 75.0 | 9675 | 35.9887 | 1.0 |
| 11.4933 | 76.0 | 9804 | 36.4713 | 1.0 |
| 11.2566 | 77.0 | 9933 | 36.9622 | 1.0 |
| 11.25 | 78.0 | 10062 | 37.1016 | 1.0 |
| 11.2962 | 79.0 | 10191 | 37.8711 | 1.0 |
| 11.0868 | 80.0 | 10320 | 38.5714 | 1.0 |
| 11.2786 | 81.0 | 10449 | 38.1493 | 1.0 |
| 11.1528 | 82.0 | 10578 | 39.0100 | 1.0 |
| 11.089 | 83.0 | 10707 | 38.5474 | 1.0 |
| 10.954 | 84.0 | 10836 | 38.9405 | 1.0 |
| 11.0157 | 85.0 | 10965 | 39.3872 | 1.0 |
| 10.9849 | 86.0 | 11094 | 39.4875 | 1.0 |
| 10.5423 | 87.0 | 11223 | 39.1179 | 1.0 |
| 11.1968 | 88.0 | 11352 | 39.4084 | 1.0 |
| 10.6376 | 89.0 | 11481 | 39.8218 | 1.0 |
| 10.7131 | 90.0 | 11610 | 39.2553 | 1.0 |
| 10.8252 | 91.0 | 11739 | 39.1368 | 1.0 |
| 10.6456 | 92.0 | 11868 | 38.9194 | 1.0 |
| 10.8488 | 93.0 | 11997 | 39.5955 | 1.0 |
| 10.8675 | 94.0 | 12126 | 39.4760 | 1.0 |
| 10.4757 | 95.0 | 12255 | 40.4844 | 1.0 |
| 10.3191 | 96.0 | 12384 | 39.0673 | 1.0 |
| 10.6073 | 97.0 | 12513 | 39.3767 | 1.0 |
| 10.3038 | 98.0 | 12642 | 39.6969 | 1.0 |
| 11.0709 | 99.0 | 12771 | 39.9325 | 1.0 |
| 10.5951 | 100.0 | 12900 | 39.8755 | 1.0 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"er",
"mito",
"ld"
] |
Dnq2025/mask2former-finetuned-ER-Mito-LD4 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mask2former-finetuned-ER-Mito-LD4
This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
It achieves the following results on the evaluation set:
- Loss: 33.8611
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 6450
### Training results
| Training Loss | Epoch | Step | Dummy | Validation Loss |
|:-------------:|:-----:|:----:|:-----:|:---------------:|
| 53.0042 | 1.0 | 129 | 1.0 | 41.6702 |
| 41.5032 | 2.0 | 258 | 1.0 | 35.3152 |
| 37.8318 | 3.0 | 387 | 1.0 | 33.2929 |
| 33.1734 | 4.0 | 516 | 1.0 | 31.6052 |
| 31.0889 | 5.0 | 645 | 1.0 | 32.0792 |
| 30.5091 | 6.0 | 774 | 1.0 | 29.4252 |
| 27.7742 | 7.0 | 903 | 1.0 | 29.3660 |
| 27.1136 | 8.0 | 1032 | 1.0 | 28.6043 |
| 25.1614 | 9.0 | 1161 | 1.0 | 28.0848 |
| 24.7794 | 10.0 | 1290 | 1.0 | 28.1507 |
| 23.636 | 11.0 | 1419 | 1.0 | 28.3853 |
| 22.7494 | 12.0 | 1548 | 1.0 | 27.2592 |
| 22.7129 | 13.0 | 1677 | 1.0 | 29.8838 |
| 21.1747 | 14.0 | 1806 | 1.0 | 28.1624 |
| 20.9589 | 15.0 | 1935 | 1.0 | 27.9121 |
| 20.2591 | 16.0 | 2064 | 1.0 | 26.6467 |
| 20.1436 | 17.0 | 2193 | 1.0 | 26.9901 |
| 19.5047 | 18.0 | 2322 | 1.0 | 29.2895 |
| 18.4257 | 19.0 | 2451 | 1.0 | 27.0489 |
| 18.6316 | 20.0 | 2580 | 1.0 | 27.3730 |
| 18.037 | 21.0 | 2709 | 1.0 | 28.0853 |
| 17.6324 | 22.0 | 2838 | 1.0 | 26.6344 |
| 17.19 | 23.0 | 2967 | 1.0 | 28.1709 |
| 17.5784 | 24.0 | 3096 | 1.0 | 26.3646 |
| 16.3714 | 25.0 | 3225 | 1.0 | 28.6477 |
| 16.2177 | 26.0 | 3354 | 1.0 | 29.9328 |
| 15.8326 | 27.0 | 3483 | 1.0 | 27.1418 |
| 15.7345 | 28.0 | 3612 | 1.0 | 28.5265 |
| 14.918 | 29.0 | 3741 | 1.0 | 30.8378 |
| 15.2316 | 30.0 | 3870 | 1.0 | 28.5173 |
| 14.6576 | 31.0 | 3999 | 1.0 | 29.0688 |
| 14.5837 | 32.0 | 4128 | 1.0 | 29.7354 |
| 13.7819 | 33.0 | 4257 | 1.0 | 28.6140 |
| 14.851 | 34.0 | 4386 | 1.0 | 30.7131 |
| 14.1454 | 35.0 | 4515 | 1.0 | 29.3673 |
| 13.5445 | 36.0 | 4644 | 1.0 | 30.1412 |
| 13.3725 | 37.0 | 4773 | 1.0 | 29.7489 |
| 13.8976 | 38.0 | 4902 | 1.0 | 32.2482 |
| 13.2317 | 39.0 | 5031 | 1.0 | 33.3837 |
| 12.8382 | 40.0 | 5160 | 1.0 | 31.9261 |
| 12.8798 | 41.0 | 5289 | 1.0 | 31.0644 |
| 12.5615 | 42.0 | 5418 | 1.0 | 32.6052 |
| 12.4595 | 43.0 | 5547 | 1.0 | 32.6710 |
| 12.9861 | 44.0 | 5676 | 1.0 | 32.3271 |
| 12.3429 | 45.0 | 5805 | 1.0 | 33.1802 |
| 11.6031 | 46.0 | 5934 | 1.0 | 33.3981 |
| 12.7182 | 47.0 | 6063 | 1.0 | 33.2806 |
| 11.8251 | 48.0 | 6192 | 1.0 | 33.9491 |
| 12.4439 | 49.0 | 6321 | 1.0 | 33.4338 |
| 11.9834 | 50.0 | 6450 | 1.0 | 33.8444 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"er",
"mito",
"ld"
] |
BigR-Oclock/segformer-b0-crop-detection |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-crop-detection
This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the BigR-Oclock/CropSegmentation dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2364
- Mean Iou: 0.4754
- Mean Accuracy: 0.9509
- Overall Accuracy: 0.9509
- Accuracy Background: nan
- Accuracy Crop: 0.9509
- Iou Background: 0.0
- Iou Crop: 0.9509
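The IoU and accuracy numbers above are the standard semantic-segmentation metrics; a minimal sketch of how such values are computed with the `evaluate` library (the tiny arrays here are toy stand-ins, not real model output):
```python
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")
# Toy 2x2 masks as stand-ins for real predictions/references (0=background, 1=crop).
pred = np.array([[1, 1], [0, 1]])
ref = np.array([[1, 1], [1, 1]])
results = mean_iou.compute(predictions=[pred], references=[ref],
                           num_labels=2, ignore_index=255)
print(results["mean_iou"], results["per_category_iou"])
```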
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crop | Iou Background | Iou Crop |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:--------------:|:--------:|
| 0.5159 | 0.1092 | 50 | 0.3885 | 0.4099 | 0.8197 | 0.8197 | nan | 0.8197 | 0.0 | 0.8197 |
| 0.3496 | 0.2183 | 100 | 0.2894 | 0.4077 | 0.8155 | 0.8155 | nan | 0.8155 | 0.0 | 0.8155 |
| 0.3076 | 0.3275 | 150 | 0.2679 | 0.4386 | 0.8773 | 0.8773 | nan | 0.8773 | 0.0 | 0.8773 |
| 0.2953 | 0.4367 | 200 | 0.2906 | 0.4444 | 0.8888 | 0.8888 | nan | 0.8888 | 0.0 | 0.8888 |
| 0.2322 | 0.5459 | 250 | 0.2511 | 0.3949 | 0.7898 | 0.7898 | nan | 0.7898 | 0.0 | 0.7898 |
| 0.2256 | 0.6550 | 300 | 0.2468 | 0.4529 | 0.9058 | 0.9058 | nan | 0.9058 | 0.0 | 0.9058 |
| 0.2706 | 0.7642 | 350 | 0.1816 | 0.4332 | 0.8663 | 0.8663 | nan | 0.8663 | 0.0 | 0.8663 |
| 0.1979 | 0.8734 | 400 | 0.2390 | 0.4521 | 0.9043 | 0.9043 | nan | 0.9043 | 0.0 | 0.9043 |
| 0.2527 | 0.9825 | 450 | 0.2981 | 0.3835 | 0.7670 | 0.7670 | nan | 0.7670 | 0.0 | 0.7670 |
| 0.1658 | 1.0917 | 500 | 0.1473 | 0.4537 | 0.9073 | 0.9073 | nan | 0.9073 | 0.0 | 0.9073 |
| 0.1866 | 1.2009 | 550 | 0.2338 | 0.4246 | 0.8492 | 0.8492 | nan | 0.8492 | 0.0 | 0.8492 |
| 0.1665 | 1.3100 | 600 | 0.1739 | 0.4639 | 0.9278 | 0.9278 | nan | 0.9278 | 0.0 | 0.9278 |
| 0.1692 | 1.4192 | 650 | 0.1808 | 0.4511 | 0.9022 | 0.9022 | nan | 0.9022 | 0.0 | 0.9022 |
| 0.1803 | 1.5284 | 700 | 0.2468 | 0.4138 | 0.8277 | 0.8277 | nan | 0.8277 | 0.0 | 0.8277 |
| 0.1722 | 1.6376 | 750 | 0.1914 | 0.4345 | 0.8691 | 0.8691 | nan | 0.8691 | 0.0 | 0.8691 |
| 0.1526 | 1.7467 | 800 | 0.2183 | 0.4396 | 0.8792 | 0.8792 | nan | 0.8792 | 0.0 | 0.8792 |
| 0.1409 | 1.8559 | 850 | 0.2273 | 0.4216 | 0.8433 | 0.8433 | nan | 0.8433 | 0.0 | 0.8433 |
| 0.169 | 1.9651 | 900 | 0.2728 | 0.4036 | 0.8072 | 0.8072 | nan | 0.8072 | 0.0 | 0.8072 |
| 0.1302 | 2.0742 | 950 | 0.2208 | 0.4452 | 0.8903 | 0.8903 | nan | 0.8903 | 0.0 | 0.8903 |
| 0.1268 | 2.1834 | 1000 | 0.2283 | 0.4253 | 0.8507 | 0.8507 | nan | 0.8507 | 0.0 | 0.8507 |
| 0.1271 | 2.2926 | 1050 | 0.1984 | 0.4506 | 0.9012 | 0.9012 | nan | 0.9012 | 0.0 | 0.9012 |
| 0.1321 | 2.4017 | 1100 | 0.1618 | 0.4560 | 0.9120 | 0.9120 | nan | 0.9120 | 0.0 | 0.9120 |
| 0.1345 | 2.5109 | 1150 | 0.1725 | 0.4659 | 0.9318 | 0.9318 | nan | 0.9318 | 0.0 | 0.9318 |
| 0.1053 | 2.6201 | 1200 | 0.1550 | 0.4574 | 0.9148 | 0.9148 | nan | 0.9148 | 0.0 | 0.9148 |
| 0.1245 | 2.7293 | 1250 | 0.1696 | 0.4816 | 0.9632 | 0.9632 | nan | 0.9632 | 0.0 | 0.9632 |
| 0.1104 | 2.8384 | 1300 | 0.2519 | 0.4330 | 0.8661 | 0.8661 | nan | 0.8661 | 0.0 | 0.8661 |
| 0.1105 | 2.9476 | 1350 | 0.1830 | 0.4655 | 0.9310 | 0.9310 | nan | 0.9310 | 0.0 | 0.9310 |
| 0.1215 | 3.0568 | 1400 | 0.2102 | 0.4596 | 0.9192 | 0.9192 | nan | 0.9192 | 0.0 | 0.9192 |
| 0.0995 | 3.1659 | 1450 | 0.2363 | 0.4478 | 0.8957 | 0.8957 | nan | 0.8957 | 0.0 | 0.8957 |
| 0.1115 | 3.2751 | 1500 | 0.1730 | 0.4717 | 0.9435 | 0.9435 | nan | 0.9435 | 0.0 | 0.9435 |
| 0.0998 | 3.3843 | 1550 | 0.2067 | 0.4535 | 0.9070 | 0.9070 | nan | 0.9070 | 0.0 | 0.9070 |
| 0.0963 | 3.4934 | 1600 | 0.2127 | 0.4701 | 0.9401 | 0.9401 | nan | 0.9401 | 0.0 | 0.9401 |
| 0.0985 | 3.6026 | 1650 | 0.1695 | 0.4686 | 0.9371 | 0.9371 | nan | 0.9371 | 0.0 | 0.9371 |
| 0.0822 | 3.7118 | 1700 | 0.2069 | 0.4494 | 0.8988 | 0.8988 | nan | 0.8988 | 0.0 | 0.8988 |
| 0.1065 | 3.8210 | 1750 | 0.2140 | 0.4590 | 0.9179 | 0.9179 | nan | 0.9179 | 0.0 | 0.9179 |
| 0.0849 | 3.9301 | 1800 | 0.2108 | 0.4592 | 0.9183 | 0.9183 | nan | 0.9183 | 0.0 | 0.9183 |
| 0.0917 | 4.0393 | 1850 | 0.1940 | 0.4668 | 0.9336 | 0.9336 | nan | 0.9336 | 0.0 | 0.9336 |
| 0.0793 | 4.1485 | 1900 | 0.1795 | 0.4649 | 0.9298 | 0.9298 | nan | 0.9298 | 0.0 | 0.9298 |
| 0.0851 | 4.2576 | 1950 | 0.2118 | 0.4462 | 0.8924 | 0.8924 | nan | 0.8924 | 0.0 | 0.8924 |
| 0.0951 | 4.3668 | 2000 | 0.2864 | 0.4212 | 0.8424 | 0.8424 | nan | 0.8424 | 0.0 | 0.8424 |
| 0.0805 | 4.4760 | 2050 | 0.1498 | 0.4683 | 0.9366 | 0.9366 | nan | 0.9366 | 0.0 | 0.9366 |
| 0.085 | 4.5852 | 2100 | 0.2223 | 0.4514 | 0.9028 | 0.9028 | nan | 0.9028 | 0.0 | 0.9028 |
| 0.0736 | 4.6943 | 2150 | 0.1860 | 0.4695 | 0.9390 | 0.9390 | nan | 0.9390 | 0.0 | 0.9390 |
| 0.079 | 4.8035 | 2200 | 0.2069 | 0.4653 | 0.9305 | 0.9305 | nan | 0.9305 | 0.0 | 0.9305 |
| 0.0701 | 4.9127 | 2250 | 0.1728 | 0.4724 | 0.9448 | 0.9448 | nan | 0.9448 | 0.0 | 0.9448 |
| 0.0994 | 5.0218 | 2300 | 0.2480 | 0.4602 | 0.9204 | 0.9204 | nan | 0.9204 | 0.0 | 0.9204 |
| 0.0749 | 5.1310 | 2350 | 0.1951 | 0.4663 | 0.9325 | 0.9325 | nan | 0.9325 | 0.0 | 0.9325 |
| 0.0691 | 5.2402 | 2400 | 0.2103 | 0.4568 | 0.9136 | 0.9136 | nan | 0.9136 | 0.0 | 0.9136 |
| 0.0653 | 5.3493 | 2450 | 0.1794 | 0.4570 | 0.9140 | 0.9140 | nan | 0.9140 | 0.0 | 0.9140 |
| 0.0621 | 5.4585 | 2500 | 0.1971 | 0.4715 | 0.9430 | 0.9430 | nan | 0.9430 | 0.0 | 0.9430 |
| 0.073 | 5.5677 | 2550 | 0.1905 | 0.4589 | 0.9179 | 0.9179 | nan | 0.9179 | 0.0 | 0.9179 |
| 0.0658 | 5.6769 | 2600 | 0.2289 | 0.4791 | 0.9581 | 0.9581 | nan | 0.9581 | 0.0 | 0.9581 |
| 0.0727 | 5.7860 | 2650 | 0.1976 | 0.4769 | 0.9539 | 0.9539 | nan | 0.9539 | 0.0 | 0.9539 |
| 0.0756 | 5.8952 | 2700 | 0.1724 | 0.4687 | 0.9373 | 0.9373 | nan | 0.9373 | 0.0 | 0.9373 |
| 0.0756 | 6.0044 | 2750 | 0.1867 | 0.4566 | 0.9133 | 0.9133 | nan | 0.9133 | 0.0 | 0.9133 |
| 0.0695 | 6.1135 | 2800 | 0.1944 | 0.4715 | 0.9430 | 0.9430 | nan | 0.9430 | 0.0 | 0.9430 |
| 0.0683 | 6.2227 | 2850 | 0.2176 | 0.4744 | 0.9488 | 0.9488 | nan | 0.9488 | 0.0 | 0.9488 |
| 0.061 | 6.3319 | 2900 | 0.1959 | 0.4663 | 0.9326 | 0.9326 | nan | 0.9326 | 0.0 | 0.9326 |
| 0.06 | 6.4410 | 2950 | 0.2090 | 0.4615 | 0.9230 | 0.9230 | nan | 0.9230 | 0.0 | 0.9230 |
| 0.0537 | 6.5502 | 3000 | 0.2119 | 0.4735 | 0.9469 | 0.9469 | nan | 0.9469 | 0.0 | 0.9469 |
| 0.0529 | 6.6594 | 3050 | 0.2043 | 0.4568 | 0.9136 | 0.9136 | nan | 0.9136 | 0.0 | 0.9136 |
| 0.08 | 6.7686 | 3100 | 0.2130 | 0.4566 | 0.9132 | 0.9132 | nan | 0.9132 | 0.0 | 0.9132 |
| 0.0632 | 6.8777 | 3150 | 0.1993 | 0.4692 | 0.9384 | 0.9384 | nan | 0.9384 | 0.0 | 0.9384 |
| 0.0641 | 6.9869 | 3200 | 0.2408 | 0.4454 | 0.8909 | 0.8909 | nan | 0.8909 | 0.0 | 0.8909 |
| 0.0517 | 7.0961 | 3250 | 0.1836 | 0.4770 | 0.9540 | 0.9540 | nan | 0.9540 | 0.0 | 0.9540 |
| 0.0584 | 7.2052 | 3300 | 0.1983 | 0.4643 | 0.9285 | 0.9285 | nan | 0.9285 | 0.0 | 0.9285 |
| 0.0559 | 7.3144 | 3350 | 0.2036 | 0.4609 | 0.9217 | 0.9217 | nan | 0.9217 | 0.0 | 0.9217 |
| 0.0621 | 7.4236 | 3400 | 0.2058 | 0.4764 | 0.9528 | 0.9528 | nan | 0.9528 | 0.0 | 0.9528 |
| 0.0641 | 7.5328 | 3450 | 0.2136 | 0.4657 | 0.9314 | 0.9314 | nan | 0.9314 | 0.0 | 0.9314 |
| 0.0481 | 7.6419 | 3500 | 0.1938 | 0.4699 | 0.9398 | 0.9398 | nan | 0.9398 | 0.0 | 0.9398 |
| 0.061 | 7.7511 | 3550 | 0.1979 | 0.4772 | 0.9545 | 0.9545 | nan | 0.9545 | 0.0 | 0.9545 |
| 0.0561 | 7.8603 | 3600 | 0.2271 | 0.4691 | 0.9382 | 0.9382 | nan | 0.9382 | 0.0 | 0.9382 |
| 0.0629 | 7.9694 | 3650 | 0.2220 | 0.4596 | 0.9192 | 0.9192 | nan | 0.9192 | 0.0 | 0.9192 |
| 0.0625 | 8.0786 | 3700 | 0.2422 | 0.4547 | 0.9094 | 0.9094 | nan | 0.9094 | 0.0 | 0.9094 |
| 0.0479 | 8.1878 | 3750 | 0.2360 | 0.4791 | 0.9581 | 0.9581 | nan | 0.9581 | 0.0 | 0.9581 |
| 0.0471 | 8.2969 | 3800 | 0.1981 | 0.4713 | 0.9427 | 0.9427 | nan | 0.9427 | 0.0 | 0.9427 |
| 0.0612 | 8.4061 | 3850 | 0.2427 | 0.4740 | 0.9479 | 0.9479 | nan | 0.9479 | 0.0 | 0.9479 |
| 0.0526 | 8.5153 | 3900 | 0.2516 | 0.4716 | 0.9432 | 0.9432 | nan | 0.9432 | 0.0 | 0.9432 |
| 0.0573 | 8.6245 | 3950 | 0.2240 | 0.4663 | 0.9325 | 0.9325 | nan | 0.9325 | 0.0 | 0.9325 |
| 0.0532 | 8.7336 | 4000 | 0.2539 | 0.4830 | 0.9659 | 0.9659 | nan | 0.9659 | 0.0 | 0.9659 |
| 0.0537 | 8.8428 | 4050 | 0.2202 | 0.4633 | 0.9267 | 0.9267 | nan | 0.9267 | 0.0 | 0.9267 |
| 0.0481 | 8.9520 | 4100 | 0.2155 | 0.4617 | 0.9234 | 0.9234 | nan | 0.9234 | 0.0 | 0.9234 |
| 0.0461 | 9.0611 | 4150 | 0.2217 | 0.4590 | 0.9181 | 0.9181 | nan | 0.9181 | 0.0 | 0.9181 |
| 0.0486 | 9.1703 | 4200 | 0.2748 | 0.4420 | 0.8841 | 0.8841 | nan | 0.8841 | 0.0 | 0.8841 |
| 0.0485 | 9.2795 | 4250 | 0.2172 | 0.4680 | 0.9360 | 0.9360 | nan | 0.9360 | 0.0 | 0.9360 |
| 0.0559 | 9.3886 | 4300 | 0.2285 | 0.4717 | 0.9434 | 0.9434 | nan | 0.9434 | 0.0 | 0.9434 |
| 0.0434 | 9.4978 | 4350 | 0.2288 | 0.4749 | 0.9498 | 0.9498 | nan | 0.9498 | 0.0 | 0.9498 |
| 0.0522 | 9.6070 | 4400 | 0.2420 | 0.4609 | 0.9218 | 0.9218 | nan | 0.9218 | 0.0 | 0.9218 |
| 0.0453 | 9.7162 | 4450 | 0.2370 | 0.4741 | 0.9481 | 0.9481 | nan | 0.9481 | 0.0 | 0.9481 |
| 0.0538 | 9.8253 | 4500 | 0.2464 | 0.4565 | 0.9130 | 0.9130 | nan | 0.9130 | 0.0 | 0.9130 |
| 0.0513 | 9.9345 | 4550 | 0.2364 | 0.4754 | 0.9509 | 0.9509 | nan | 0.9509 | 0.0 | 0.9509 |
### Framework versions
- Transformers 4.50.3
- Pytorch 2.6.0+cu118
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"background",
"crop"
] |
Godouche/segformer-b0-finetuned-segments-sidewalk-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-segments-sidewalk-2
This model is a fine-tuned version of [pretrained_model_name](https://huggingface.co/pretrained_model_name) on the segments/sidewalk-semantic dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9318
- Mean Iou: 0.1574
- Mean Accuracy: 0.1994
- Overall Accuracy: 0.7668
- Accuracy Unlabeled: nan
- Accuracy Flat-road: 0.8689
- Accuracy Flat-sidewalk: 0.9484
- Accuracy Flat-crosswalk: 0.0
- Accuracy Flat-cyclinglane: 0.5015
- Accuracy Flat-parkingdriveway: 0.0118
- Accuracy Flat-railtrack: 0.0
- Accuracy Flat-curb: 0.0
- Accuracy Human-person: 0.0
- Accuracy Human-rider: 0.0
- Accuracy Vehicle-car: 0.8879
- Accuracy Vehicle-truck: 0.0
- Accuracy Vehicle-bus: 0.0
- Accuracy Vehicle-tramtrain: 0.0
- Accuracy Vehicle-motorcycle: 0.0
- Accuracy Vehicle-bicycle: 0.0
- Accuracy Vehicle-caravan: 0.0
- Accuracy Vehicle-cartrailer: 0.0
- Accuracy Construction-building: 0.8895
- Accuracy Construction-door: 0.0
- Accuracy Construction-wall: 0.0
- Accuracy Construction-fenceguardrail: 0.0
- Accuracy Construction-bridge: 0.0
- Accuracy Construction-tunnel: 0.0
- Accuracy Construction-stairs: 0.0
- Accuracy Object-pole: 0.0
- Accuracy Object-trafficsign: 0.0
- Accuracy Object-trafficlight: 0.0
- Accuracy Nature-vegetation: 0.9372
- Accuracy Nature-terrain: 0.8070
- Accuracy Sky: 0.9258
- Accuracy Void-ground: 0.0
- Accuracy Void-dynamic: 0.0
- Accuracy Void-static: 0.0
- Accuracy Void-unclear: 0.0
- Iou Unlabeled: nan
- Iou Flat-road: 0.5983
- Iou Flat-sidewalk: 0.7913
- Iou Flat-crosswalk: 0.0
- Iou Flat-cyclinglane: 0.4760
- Iou Flat-parkingdriveway: 0.0117
- Iou Flat-railtrack: 0.0
- Iou Flat-curb: 0.0
- Iou Human-person: 0.0
- Iou Human-rider: 0.0
- Iou Vehicle-car: 0.6842
- Iou Vehicle-truck: 0.0
- Iou Vehicle-bus: 0.0
- Iou Vehicle-tramtrain: 0.0
- Iou Vehicle-motorcycle: 0.0
- Iou Vehicle-bicycle: 0.0
- Iou Vehicle-caravan: 0.0
- Iou Vehicle-cartrailer: 0.0
- Iou Construction-building: 0.5394
- Iou Construction-door: 0.0
- Iou Construction-wall: 0.0
- Iou Construction-fenceguardrail: 0.0
- Iou Construction-bridge: 0.0
- Iou Construction-tunnel: 0.0
- Iou Construction-stairs: 0.0
- Iou Object-pole: 0.0
- Iou Object-trafficsign: 0.0
- Iou Object-trafficlight: 0.0
- Iou Nature-vegetation: 0.7925
- Iou Nature-terrain: 0.6357
- Iou Sky: 0.8218
- Iou Void-ground: 0.0
- Iou Void-dynamic: 0.0
- Iou Void-static: 0.0
- Iou Void-unclear: 0.0
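Per-class tables like the one above are typically produced with the `evaluate` library's `mean_iou` metric. A minimal sketch follows; the toy masks, the 255 ignore index, and `reduce_labels=False` are assumptions for illustration, not facts read from this card:
```python
# Sketch: computing mean IoU / per-class IoU with evaluate's "mean_iou" metric.
# The masks below are toy data; ignore_index=255 and reduce_labels=False are
# assumptions about how unlabeled pixels were handled.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")
pred_mask = np.zeros((4, 4), dtype=np.int64)   # toy predicted segmentation map
true_mask = np.zeros((4, 4), dtype=np.int64)   # toy ground-truth map
true_mask[0, 0] = 255                          # an unlabeled pixel, ignored below

results = metric.compute(
    predictions=[pred_mask],
    references=[true_mask],
    num_labels=35,          # the 35 sidewalk classes listed at the end of this card
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["overall_accuracy"])
print(results["per_category_iou"])             # one entry per class, nan if absent
```
Classes that never occur in the validation masks come back as `nan` (hence `Iou Unlabeled: nan` above) and are skipped when the per-class scores are averaged into the headline mean IoU.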
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- num_epochs: 2
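A minimal sketch of how these settings map onto `transformers.TrainingArguments`; the `output_dir` is an assumption, and anything not listed on the card keeps its library default:
```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-2",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",         # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=2,
)
```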
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:|
| 3.0121 | 0.05 | 20 | 3.2051 | 0.0715 | 0.1161 | 0.5499 | nan | 0.2271 | 0.8331 | 0.0 | 0.0034 | 0.0152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6998 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8031 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9122 | 0.0579 | 0.3930 | 0.0 | 0.0 | 0.0000 | 0.0 | 0.0 | 0.1879 | 0.5784 | 0.0 | 0.0034 | 0.0132 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2923 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6196 | 0.0429 | 0.3425 | 0.0 | 0.0 | 0.0000 | 0.0 |
| 2.4574 | 0.1 | 40 | 2.3972 | 0.0898 | 0.1343 | 0.5989 | nan | 0.2409 | 0.9266 | 0.0 | 0.0002 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8717 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8243 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8739 | 0.3707 | 0.4554 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.1989 | 0.5723 | 0.0 | 0.0002 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4603 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4150 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6861 | 0.2795 | 0.4398 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2526 | 0.15 | 60 | 2.1035 | 0.1068 | 0.1532 | 0.6450 | nan | 0.5346 | 0.9003 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9055 | 0.4196 | 0.7392 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.3832 | 0.6253 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4631 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4563 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7142 | 0.3137 | 0.6760 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0063 | 0.2 | 80 | 1.9647 | 0.1103 | 0.1557 | 0.6617 | nan | 0.6366 | 0.9038 | 0.0 | 0.0004 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7976 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8720 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9323 | 0.3499 | 0.8013 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4323 | 0.6691 | 0.0 | 0.0004 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5189 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4518 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7048 | 0.2656 | 0.7066 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8506 | 0.25 | 100 | 1.7593 | 0.1193 | 0.1664 | 0.6793 | nan | 0.7330 | 0.8892 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8503 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8105 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9473 | 0.5456 | 0.8795 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4502 | 0.6776 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4805 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7323 | 0.4208 | 0.7421 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5607 | 0.3 | 120 | 1.6274 | 0.1239 | 0.1706 | 0.6869 | nan | 0.7258 | 0.8926 | 0.0 | 0.0207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8619 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8674 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9326 | 0.6165 | 0.8834 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4536 | 0.6895 | 0.0 | 0.0207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5639 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4743 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7454 | 0.4945 | 0.7712 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.306 | 0.35 | 140 | 1.5786 | 0.1247 | 0.1716 | 0.6887 | nan | 0.7598 | 0.8869 | 0.0 | 0.0086 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8176 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8673 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9306 | 0.6645 | 0.8976 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4370 | 0.7152 | 0.0 | 0.0086 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5606 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4595 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7578 | 0.5428 | 0.7580 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.575 | 0.4 | 160 | 1.5082 | 0.1288 | 0.1752 | 0.6996 | nan | 0.7417 | 0.9080 | 0.0 | 0.0709 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8809 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9249 | 0.7570 | 0.8812 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5026 | 0.6995 | 0.0 | 0.0708 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5966 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4679 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7603 | 0.4979 | 0.7831 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.6042 | 0.45 | 180 | 1.4727 | 0.1287 | 0.1787 | 0.6995 | nan | 0.8500 | 0.8738 | 0.0 | 0.0384 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8816 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9079 | 0.7637 | 0.9126 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4520 | 0.7393 | 0.0 | 0.0384 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5698 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4946 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7624 | 0.5407 | 0.7774 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4266 | 0.5 | 200 | 1.3875 | 0.1306 | 0.1762 | 0.7031 | nan | 0.7264 | 0.9328 | 0.0 | 0.0343 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8636 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9016 | 0.7596 | 0.8893 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5032 | 0.6913 | 0.0 | 0.0343 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7766 | 0.5517 | 0.7924 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1122 | 0.55 | 220 | 1.3270 | 0.1328 | 0.1805 | 0.7105 | nan | 0.8744 | 0.8741 | 0.0 | 0.1523 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8563 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7965 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9594 | 0.7025 | 0.9218 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4650 | 0.7567 | 0.0 | 0.1523 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6124 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5071 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7412 | 0.5084 | 0.7720 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3171 | 0.6 | 240 | 1.2920 | 0.1391 | 0.1886 | 0.7254 | nan | 0.7889 | 0.9304 | 0.0 | 0.3131 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8646 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8537 | 0.8551 | 0.9024 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5584 | 0.7367 | 0.0 | 0.3089 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5828 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7534 | 0.5025 | 0.7871 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7291 | 0.65 | 260 | 1.2315 | 0.1451 | 0.1894 | 0.7364 | nan | 0.8327 | 0.9268 | 0.0 | 0.3435 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8883 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9039 | 0.8075 | 0.8963 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5462 | 0.7501 | 0.0 | 0.3407 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4953 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7818 | 0.5866 | 0.7929 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4373 | 0.7 | 280 | 1.1735 | 0.1435 | 0.1888 | 0.7379 | nan | 0.8262 | 0.9417 | 0.0 | 0.3482 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8520 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9055 | 0.7118 | 0.9042 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5570 | 0.7486 | 0.0 | 0.3453 | 0.0000 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5719 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7796 | 0.5634 | 0.7926 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3842 | 0.75 | 300 | 1.1702 | 0.1466 | 0.1923 | 0.7440 | nan | 0.8228 | 0.9429 | 0.0 | 0.4033 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8944 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8953 | 0.8006 | 0.9032 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5734 | 0.7599 | 0.0 | 0.3954 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7754 | 0.5717 | 0.7900 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8302 | 0.8 | 320 | 1.1554 | 0.1474 | 0.1912 | 0.7443 | nan | 0.8402 | 0.9093 | 0.0 | 0.4951 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8403 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8897 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9543 | 0.6583 | 0.9150 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5629 | 0.7798 | 0.0 | 0.4551 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6264 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4946 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7540 | 0.5530 | 0.7861 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0734 | 0.85 | 340 | 1.1194 | 0.1497 | 0.1952 | 0.7499 | nan | 0.8367 | 0.9363 | 0.0 | 0.4375 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8998 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9168 | 0.8314 | 0.9208 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5623 | 0.7669 | 0.0 | 0.4207 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6308 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7792 | 0.5933 | 0.7986 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1957 | 0.9 | 360 | 1.1065 | 0.1488 | 0.1951 | 0.7433 | nan | 0.8849 | 0.8898 | 0.0 | 0.4485 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8774 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9276 | 0.7874 | 0.9128 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5175 | 0.7792 | 0.0 | 0.4225 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7878 | 0.6150 | 0.8002 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0868 | 0.95 | 380 | 1.0887 | 0.1505 | 0.1912 | 0.7471 | nan | 0.7503 | 0.9557 | 0.0 | 0.4570 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8680 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9330 | 0.7527 | 0.8823 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5592 | 0.7452 | 0.0 | 0.4208 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6647 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5208 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7882 | 0.6068 | 0.8093 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.6125 | 1.0 | 400 | 1.0484 | 0.1512 | 0.1960 | 0.7522 | nan | 0.8825 | 0.9225 | 0.0 | 0.4567 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8529 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9259 | 0.7985 | 0.9192 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5386 | 0.7864 | 0.0 | 0.4375 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7843 | 0.6077 | 0.8088 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3332 | 1.05 | 420 | 1.0294 | 0.1523 | 0.1942 | 0.7545 | nan | 0.8265 | 0.9434 | 0.0 | 0.4805 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8711 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8838 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9403 | 0.7424 | 0.9143 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5748 | 0.7724 | 0.0 | 0.4583 | 0.0010 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6613 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5187 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7815 | 0.6039 | 0.8060 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5885 | 1.1 | 440 | 1.0505 | 0.1499 | 0.1920 | 0.7495 | nan | 0.7691 | 0.9622 | 0.0 | 0.4316 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8800 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8616 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9372 | 0.7558 | 0.9277 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6003 | 0.7422 | 0.0 | 0.4224 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6559 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7671 | 0.5738 | 0.8056 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1033 | 1.15 | 460 | 1.0482 | 0.1515 | 0.1969 | 0.7561 | nan | 0.8663 | 0.9229 | 0.0 | 0.5067 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8891 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9342 | 0.7461 | 0.9213 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5763 | 0.7942 | 0.0 | 0.4699 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6265 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5243 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7818 | 0.5704 | 0.8047 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.2376 | 1.2 | 480 | 0.9907 | 0.1534 | 0.1969 | 0.7586 | nan | 0.8626 | 0.9393 | 0.0 | 0.4765 | 0.0025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9293 | 0.7922 | 0.9122 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5706 | 0.7866 | 0.0 | 0.4577 | 0.0025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6607 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7847 | 0.6058 | 0.8099 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0037 | 1.25 | 500 | 1.0006 | 0.1525 | 0.1966 | 0.7568 | nan | 0.8847 | 0.9340 | 0.0 | 0.4579 | 0.0030 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8884 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8794 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9192 | 0.7929 | 0.9262 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5564 | 0.7956 | 0.0 | 0.4466 | 0.0030 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6583 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5336 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7866 | 0.5935 | 0.8109 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8961 | 1.3 | 520 | 0.9965 | 0.1544 | 0.1962 | 0.7588 | nan | 0.8673 | 0.9513 | 0.0 | 0.4645 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8450 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9055 | 0.8151 | 0.9210 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5845 | 0.7805 | 0.0 | 0.4501 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6823 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7941 | 0.6295 | 0.8089 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.2659 | 1.35 | 540 | 0.9665 | 0.1559 | 0.2002 | 0.7642 | nan | 0.8463 | 0.9409 | 0.0 | 0.5501 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8996 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8849 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9281 | 0.8278 | 0.9255 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5940 | 0.7939 | 0.0 | 0.5013 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5310 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7871 | 0.6206 | 0.8107 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0982 | 1.4 | 560 | 0.9850 | 0.1539 | 0.1970 | 0.7600 | nan | 0.8854 | 0.9320 | 0.0 | 0.4731 | 0.0035 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8859 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9415 | 0.7702 | 0.9236 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5686 | 0.7967 | 0.0 | 0.4556 | 0.0035 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5330 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7864 | 0.6114 | 0.8084 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9199 | 1.45 | 580 | 0.9775 | 0.1562 | 0.1975 | 0.7638 | nan | 0.8526 | 0.9460 | 0.0 | 0.5063 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8880 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9058 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9427 | 0.7703 | 0.8906 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5977 | 0.7834 | 0.0 | 0.4762 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6786 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7931 | 0.6229 | 0.8139 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.18 | 1.5 | 600 | 0.9614 | 0.1566 | 0.1999 | 0.7651 | nan | 0.8643 | 0.9444 | 0.0 | 0.5169 | 0.0076 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8853 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9254 | 0.8260 | 0.9211 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5910 | 0.7880 | 0.0 | 0.4789 | 0.0075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6639 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5430 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7985 | 0.6359 | 0.8160 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0018 | 1.55 | 620 | 0.9561 | 0.1559 | 0.1999 | 0.7646 | nan | 0.8482 | 0.9371 | 0.0 | 0.5618 | 0.0125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8892 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9396 | 0.7710 | 0.9336 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5921 | 0.7961 | 0.0 | 0.4994 | 0.0124 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6519 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5342 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7899 | 0.6146 | 0.8114 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.2407 | 1.6 | 640 | 0.9485 | 0.1569 | 0.1990 | 0.7655 | nan | 0.8720 | 0.9414 | 0.0 | 0.5204 | 0.0036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8930 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9382 | 0.7939 | 0.9093 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5839 | 0.7950 | 0.0 | 0.4856 | 0.0036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7942 | 0.6388 | 0.8179 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9654 | 1.65 | 660 | 0.9485 | 0.1573 | 0.1998 | 0.7642 | nan | 0.8749 | 0.9361 | 0.0 | 0.5410 | 0.0040 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8645 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9020 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9214 | 0.8284 | 0.9216 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5784 | 0.7939 | 0.0 | 0.4909 | 0.0039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6952 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5320 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7975 | 0.6365 | 0.8209 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9202 | 1.7 | 680 | 0.9327 | 0.1570 | 0.1996 | 0.7649 | nan | 0.8746 | 0.9393 | 0.0 | 0.5136 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8742 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9291 | 0.8316 | 0.9163 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5860 | 0.7933 | 0.0 | 0.4807 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6877 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5350 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7966 | 0.6295 | 0.8220 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5064 | 1.75 | 700 | 0.9305 | 0.1557 | 0.1987 | 0.7650 | nan | 0.8746 | 0.9420 | 0.0 | 0.5082 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9001 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8607 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9499 | 0.7841 | 0.9311 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5905 | 0.7950 | 0.0 | 0.4787 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6616 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7777 | 0.6234 | 0.8155 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9806 | 1.8 | 720 | 0.9289 | 0.1565 | 0.2001 | 0.7657 | nan | 0.8747 | 0.9412 | 0.0 | 0.5171 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8950 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8818 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9310 | 0.8209 | 0.9366 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5897 | 0.7943 | 0.0 | 0.4820 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6716 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5422 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7941 | 0.6275 | 0.8150 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3995 | 1.85 | 740 | 0.9427 | 0.1569 | 0.1997 | 0.7655 | nan | 0.8686 | 0.9454 | 0.0 | 0.5040 | 0.0111 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8918 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9074 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9210 | 0.8175 | 0.9246 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5975 | 0.7928 | 0.0 | 0.4749 | 0.0110 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6760 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5285 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8008 | 0.6342 | 0.8179 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8194 | 1.9 | 760 | 0.9331 | 0.1567 | 0.2001 | 0.7654 | nan | 0.8728 | 0.9448 | 0.0 | 0.4979 | 0.0122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8907 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9180 | 0.8365 | 0.9265 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5943 | 0.7930 | 0.0 | 0.4715 | 0.0121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6821 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7996 | 0.6210 | 0.8210 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8659 | 1.95 | 780 | 0.9291 | 0.1571 | 0.1998 | 0.7669 | nan | 0.8656 | 0.9484 | 0.0 | 0.5035 | 0.0142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8874 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9352 | 0.8104 | 0.9272 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.6015 | 0.7913 | 0.0 | 0.4774 | 0.0140 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6710 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5418 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7927 | 0.6314 | 0.8208 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.878 | 2.0 | 800 | 0.9318 | 0.1574 | 0.1994 | 0.7668 | nan | 0.8689 | 0.9484 | 0.0 | 0.5015 | 0.0118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8879 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8895 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9372 | 0.8070 | 0.9258 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5983 | 0.7913 | 0.0 | 0.4760 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6842 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7925 | 0.6357 | 0.8218 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.50.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"unlabeled",
"flat-road",
"flat-sidewalk",
"flat-crosswalk",
"flat-cyclinglane",
"flat-parkingdriveway",
"flat-railtrack",
"flat-curb",
"human-person",
"human-rider",
"vehicle-car",
"vehicle-truck",
"vehicle-bus",
"vehicle-tramtrain",
"vehicle-motorcycle",
"vehicle-bicycle",
"vehicle-caravan",
"vehicle-cartrailer",
"construction-building",
"construction-door",
"construction-wall",
"construction-fenceguardrail",
"construction-bridge",
"construction-tunnel",
"construction-stairs",
"object-pole",
"object-trafficsign",
"object-trafficlight",
"nature-vegetation",
"nature-terrain",
"sky",
"void-ground",
"void-dynamic",
"void-static",
"void-unclear"
] |
mujerry/segformer-b0-finetuned-ade-512-512_necrosis |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-ade-512-512_necrosis
This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0608
- Mean Iou: 0.8722
- Mean Accuracy: 0.9242
- Overall Accuracy: 0.9813
- Accuracy Background: 0.9949
- Accuracy Necrosis: 0.8211
- Accuracy Root: 0.9564
- Iou Background: 0.9895
- Iou Necrosis: 0.7138
- Iou Root: 0.9132
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 30
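With `lr_scheduler_warmup_ratio: 0.05` and the 960 optimizer steps reached at epoch 30 in the table below (32 steps per epoch), the linear schedule warms up for about 0.05 × 960 = 48 steps before decaying to zero. A sketch of an equivalent manual setup; the step counts are read off the results table, and the base-model reload is only illustrative:
```python
# Sketch of the warmup implied by lr_scheduler_warmup_ratio=0.05: the table
# below reaches step 960 at epoch 30, so warmup covers ceil(0.05 * 960) = 48 steps.
import math
import torch
from transformers import SegformerForSemanticSegmentation, get_linear_schedule_with_warmup

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/segformer-b0-finetuned-ade-512-512",
    num_labels=3,                   # background, necrosis, root
    ignore_mismatched_sizes=True,   # swap the 150-class ADE head for a 3-class one
)
optimizer = torch.optim.AdamW(model.parameters(), lr=6e-5, betas=(0.9, 0.999), eps=1e-8)

total_steps = 960                                # 30 epochs x 32 steps/epoch
warmup_steps = math.ceil(0.05 * total_steps)     # = 48
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)
```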
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Necrosis | Accuracy Root | Iou Background | Iou Necrosis | Iou Root |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------:|:-------------:|:--------------:|:------------:|:--------:|
| 0.8601 | 0.625 | 20 | 0.8393 | 0.5508 | 0.6494 | 0.8874 | 0.9288 | 0.1647 | 0.8546 | 0.9164 | 0.0706 | 0.6654 |
| 0.6009 | 1.25 | 40 | 0.5086 | 0.5870 | 0.6565 | 0.9438 | 0.9765 | 0.0084 | 0.9847 | 0.9727 | 0.0078 | 0.7805 |
| 0.4497 | 1.875 | 60 | 0.3599 | 0.5953 | 0.6654 | 0.9438 | 0.9750 | 0.0350 | 0.9862 | 0.9698 | 0.0333 | 0.7826 |
| 0.3922 | 2.5 | 80 | 0.2861 | 0.6267 | 0.6957 | 0.9469 | 0.9752 | 0.1275 | 0.9846 | 0.9717 | 0.1183 | 0.7902 |
| 0.2496 | 3.125 | 100 | 0.2361 | 0.7322 | 0.7918 | 0.9622 | 0.9846 | 0.4183 | 0.9726 | 0.9804 | 0.3765 | 0.8398 |
| 0.2184 | 3.75 | 120 | 0.1989 | 0.7824 | 0.8508 | 0.9664 | 0.9840 | 0.6063 | 0.9621 | 0.9805 | 0.5121 | 0.8546 |
| 0.2193 | 4.375 | 140 | 0.1700 | 0.8123 | 0.8765 | 0.9721 | 0.9900 | 0.6864 | 0.9529 | 0.9849 | 0.5768 | 0.8752 |
| 0.1705 | 5.0 | 160 | 0.1500 | 0.8121 | 0.8731 | 0.9723 | 0.9889 | 0.6684 | 0.9621 | 0.9851 | 0.5749 | 0.8763 |
| 0.1611 | 5.625 | 180 | 0.1420 | 0.8381 | 0.9065 | 0.9753 | 0.9942 | 0.7919 | 0.9333 | 0.9863 | 0.6416 | 0.8863 |
| 0.128 | 6.25 | 200 | 0.1293 | 0.8420 | 0.9101 | 0.9763 | 0.9938 | 0.7972 | 0.9393 | 0.9873 | 0.6473 | 0.8914 |
| 0.1368 | 6.875 | 220 | 0.1115 | 0.8385 | 0.8990 | 0.9763 | 0.9914 | 0.7476 | 0.9581 | 0.9874 | 0.6362 | 0.8920 |
| 0.1459 | 7.5 | 240 | 0.1074 | 0.8411 | 0.8985 | 0.9771 | 0.9929 | 0.7457 | 0.9568 | 0.9881 | 0.6397 | 0.8955 |
| 0.1066 | 8.125 | 260 | 0.1026 | 0.8505 | 0.9127 | 0.9776 | 0.9947 | 0.8020 | 0.9415 | 0.9877 | 0.6676 | 0.8963 |
| 0.0973 | 8.75 | 280 | 0.0959 | 0.8558 | 0.9189 | 0.9787 | 0.9931 | 0.8118 | 0.9517 | 0.9885 | 0.6769 | 0.9020 |
| 0.1286 | 9.375 | 300 | 0.0883 | 0.8544 | 0.9024 | 0.9792 | 0.9944 | 0.7529 | 0.9598 | 0.9885 | 0.6704 | 0.9043 |
| 0.0824 | 10.0 | 320 | 0.0872 | 0.8614 | 0.9190 | 0.9796 | 0.9934 | 0.8078 | 0.9559 | 0.9887 | 0.6896 | 0.9058 |
| 0.083 | 10.625 | 340 | 0.0868 | 0.8641 | 0.9205 | 0.9796 | 0.9955 | 0.8207 | 0.9453 | 0.9882 | 0.6990 | 0.9051 |
| 0.0794 | 11.25 | 360 | 0.0816 | 0.8612 | 0.9198 | 0.9796 | 0.9943 | 0.8142 | 0.9510 | 0.9889 | 0.6893 | 0.9054 |
| 0.0979 | 11.875 | 380 | 0.0816 | 0.8575 | 0.9062 | 0.9796 | 0.9929 | 0.7582 | 0.9675 | 0.9888 | 0.6770 | 0.9066 |
| 0.0734 | 12.5 | 400 | 0.0785 | 0.8584 | 0.9033 | 0.9799 | 0.9949 | 0.7537 | 0.9612 | 0.9889 | 0.6790 | 0.9073 |
| 0.108 | 13.125 | 420 | 0.0749 | 0.8642 | 0.9161 | 0.9802 | 0.9949 | 0.7983 | 0.9551 | 0.9889 | 0.6954 | 0.9084 |
| 0.0803 | 13.75 | 440 | 0.0758 | 0.8691 | 0.9265 | 0.9804 | 0.9949 | 0.8359 | 0.9488 | 0.9887 | 0.7099 | 0.9086 |
| 0.0812 | 14.375 | 460 | 0.0734 | 0.8683 | 0.9235 | 0.9805 | 0.9949 | 0.8238 | 0.9517 | 0.9889 | 0.7067 | 0.9094 |
| 0.0715 | 15.0 | 480 | 0.0696 | 0.8683 | 0.9239 | 0.9806 | 0.9931 | 0.8180 | 0.9605 | 0.9892 | 0.7054 | 0.9104 |
| 0.0673 | 15.625 | 500 | 0.0675 | 0.8698 | 0.9275 | 0.9808 | 0.9938 | 0.8328 | 0.9560 | 0.9893 | 0.7091 | 0.9109 |
| 0.072 | 16.25 | 520 | 0.0696 | 0.8699 | 0.9231 | 0.9809 | 0.9948 | 0.8195 | 0.9550 | 0.9892 | 0.7094 | 0.9112 |
| 0.0681 | 16.875 | 540 | 0.0696 | 0.8696 | 0.9235 | 0.9806 | 0.9955 | 0.8255 | 0.9496 | 0.9889 | 0.7105 | 0.9096 |
| 0.0641 | 17.5 | 560 | 0.0671 | 0.8618 | 0.9063 | 0.9805 | 0.9944 | 0.7587 | 0.9657 | 0.9894 | 0.6860 | 0.9101 |
| 0.0842 | 18.125 | 580 | 0.0681 | 0.8692 | 0.9211 | 0.9808 | 0.9948 | 0.8128 | 0.9558 | 0.9892 | 0.7073 | 0.9111 |
| 0.0738 | 18.75 | 600 | 0.0661 | 0.8693 | 0.9214 | 0.9809 | 0.9942 | 0.8109 | 0.9591 | 0.9893 | 0.7070 | 0.9116 |
| 0.0629 | 19.375 | 620 | 0.0640 | 0.8685 | 0.9177 | 0.9810 | 0.9937 | 0.7946 | 0.9648 | 0.9895 | 0.7037 | 0.9122 |
| 0.064 | 20.0 | 640 | 0.0637 | 0.8705 | 0.9238 | 0.9811 | 0.9936 | 0.8162 | 0.9616 | 0.9896 | 0.7093 | 0.9128 |
| 0.0599 | 20.625 | 660 | 0.0638 | 0.8704 | 0.9221 | 0.9811 | 0.9950 | 0.8153 | 0.9561 | 0.9894 | 0.7098 | 0.9121 |
| 0.0645 | 21.25 | 680 | 0.0644 | 0.8715 | 0.9257 | 0.9811 | 0.9939 | 0.8243 | 0.9588 | 0.9894 | 0.7126 | 0.9126 |
| 0.0843 | 21.875 | 700 | 0.0643 | 0.8670 | 0.9131 | 0.9810 | 0.9949 | 0.7827 | 0.9619 | 0.9895 | 0.6995 | 0.9119 |
| 0.0578 | 22.5 | 720 | 0.0629 | 0.8716 | 0.9255 | 0.9809 | 0.9958 | 0.8319 | 0.9486 | 0.9890 | 0.7151 | 0.9107 |
| 0.0586 | 23.125 | 740 | 0.0616 | 0.8681 | 0.9178 | 0.9810 | 0.9937 | 0.7949 | 0.9647 | 0.9896 | 0.7023 | 0.9123 |
| 0.0678 | 23.75 | 760 | 0.0614 | 0.8732 | 0.9318 | 0.9812 | 0.9944 | 0.8481 | 0.9528 | 0.9895 | 0.7176 | 0.9124 |
| 0.0757 | 24.375 | 780 | 0.0627 | 0.8680 | 0.9151 | 0.9811 | 0.9949 | 0.7891 | 0.9613 | 0.9896 | 0.7019 | 0.9125 |
| 0.081 | 25.0 | 800 | 0.0621 | 0.8721 | 0.9248 | 0.9813 | 0.9950 | 0.8242 | 0.9553 | 0.9895 | 0.7138 | 0.9129 |
| 0.0628 | 25.625 | 820 | 0.0604 | 0.8718 | 0.9239 | 0.9814 | 0.9941 | 0.8173 | 0.9604 | 0.9896 | 0.7121 | 0.9136 |
| 0.0515 | 26.25 | 840 | 0.0612 | 0.8720 | 0.9233 | 0.9813 | 0.9945 | 0.8162 | 0.9591 | 0.9896 | 0.7131 | 0.9134 |
| 0.0735 | 26.875 | 860 | 0.0605 | 0.8719 | 0.9224 | 0.9813 | 0.9953 | 0.8159 | 0.9559 | 0.9895 | 0.7132 | 0.9131 |
| 0.06 | 27.5 | 880 | 0.0610 | 0.8729 | 0.9254 | 0.9814 | 0.9951 | 0.8259 | 0.9551 | 0.9895 | 0.7160 | 0.9133 |
| 0.0525 | 28.125 | 900 | 0.0610 | 0.8716 | 0.9227 | 0.9813 | 0.9946 | 0.8147 | 0.9588 | 0.9896 | 0.7118 | 0.9134 |
| 0.0738 | 28.75 | 920 | 0.0610 | 0.8713 | 0.9217 | 0.9813 | 0.9949 | 0.8120 | 0.9584 | 0.9896 | 0.7111 | 0.9133 |
| 0.0632 | 29.375 | 940 | 0.0606 | 0.8718 | 0.9228 | 0.9813 | 0.9951 | 0.8166 | 0.9566 | 0.9895 | 0.7129 | 0.9131 |
| 0.0547 | 30.0 | 960 | 0.0608 | 0.8722 | 0.9242 | 0.9813 | 0.9949 | 0.8211 | 0.9564 | 0.9895 | 0.7138 | 0.9132 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"necrosis",
"root"
] |
mujerry/segformer-b2-finetuned-ade-512-512_necrosis |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b2-finetuned-ade-512-512_necrosis
This model is a fine-tuned version of [nvidia/segformer-b2-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b2-finetuned-ade-512-512) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0547
- Mean Iou: 0.8851
- Mean Accuracy: 0.9274
- Overall Accuracy: 0.9826
- Accuracy Background: 0.9941
- Accuracy Necrosis: 0.8203
- Accuracy Root: 0.9678
- Iou Background: 0.9889
- Iou Necrosis: 0.7417
- Iou Root: 0.9247
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
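For context, a minimal inference sketch for a 3-class SegFormer such as this one; the repo id is assumed to be where these weights were pushed, and the input image path is hypothetical:
```python
# Sketch: running a fine-tuned 3-class SegFormer on one image.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "mujerry/segformer-b2-finetuned-ade-512-512_necrosis"  # assumed location
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("root_scan.png").convert("RGB")   # hypothetical input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits                  # (1, 3, H/4, W/4)

# Upsample to the input resolution, then take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]                    # 0=background, 1=necrosis, 2=root
```
SegFormer emits logits at a quarter of the input resolution, so the bilinear upsampling step is needed before comparing predictions against full-size masks.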
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Necrosis | Accuracy Root | Iou Background | Iou Necrosis | Iou Root |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------:|:-------------:|:--------------:|:------------:|:--------:|
| 1.0136 | 0.3125 | 20 | 0.9745 | 0.2835 | 0.5534 | 0.5117 | 0.5703 | 0.8384 | 0.2516 | 0.5531 | 0.0588 | 0.2387 |
| 0.782 | 0.625 | 40 | 0.6546 | 0.6443 | 0.7573 | 0.9244 | 0.9470 | 0.3958 | 0.9292 | 0.9426 | 0.1808 | 0.8096 |
| 0.5646 | 0.9375 | 60 | 0.5035 | 0.6000 | 0.6673 | 0.9352 | 0.9622 | 0.0591 | 0.9807 | 0.9595 | 0.0417 | 0.7987 |
| 0.4075 | 1.25 | 80 | 0.3676 | 0.6185 | 0.6781 | 0.9491 | 0.9802 | 0.0744 | 0.9797 | 0.9743 | 0.0697 | 0.8114 |
| 0.3336 | 1.5625 | 100 | 0.2976 | 0.6525 | 0.7111 | 0.9526 | 0.9793 | 0.1703 | 0.9838 | 0.9751 | 0.1626 | 0.8198 |
| 0.3046 | 1.875 | 120 | 0.2017 | 0.8358 | 0.9058 | 0.9716 | 0.9905 | 0.7937 | 0.9334 | 0.9798 | 0.6453 | 0.8823 |
| 0.1448 | 2.1875 | 140 | 0.1557 | 0.8383 | 0.9006 | 0.9725 | 0.9850 | 0.7537 | 0.9631 | 0.9798 | 0.6465 | 0.8885 |
| 0.1214 | 2.5 | 160 | 0.1194 | 0.8600 | 0.9089 | 0.9773 | 0.9944 | 0.7847 | 0.9475 | 0.9840 | 0.6915 | 0.9044 |
| 0.1044 | 2.8125 | 180 | 0.1037 | 0.8590 | 0.9012 | 0.9779 | 0.9938 | 0.7523 | 0.9575 | 0.9848 | 0.6852 | 0.9069 |
| 0.0875 | 3.125 | 200 | 0.1002 | 0.8520 | 0.8956 | 0.9769 | 0.9906 | 0.7280 | 0.9681 | 0.9844 | 0.6686 | 0.9031 |
| 0.0873 | 3.4375 | 220 | 0.0873 | 0.8574 | 0.8968 | 0.9781 | 0.9919 | 0.7293 | 0.9693 | 0.9853 | 0.6787 | 0.9083 |
| 0.0823 | 3.75 | 240 | 0.0876 | 0.8712 | 0.9292 | 0.9789 | 0.9944 | 0.8486 | 0.9447 | 0.9857 | 0.7185 | 0.9094 |
| 0.0828 | 4.0625 | 260 | 0.0866 | 0.8657 | 0.9290 | 0.9765 | 0.9934 | 0.8578 | 0.9357 | 0.9827 | 0.7143 | 0.9002 |
| 0.0601 | 4.375 | 280 | 0.0774 | 0.8619 | 0.9002 | 0.9787 | 0.9937 | 0.7430 | 0.9638 | 0.9857 | 0.6901 | 0.9100 |
| 0.0734 | 4.6875 | 300 | 0.0746 | 0.8588 | 0.8964 | 0.9787 | 0.9924 | 0.7261 | 0.9708 | 0.9860 | 0.6798 | 0.9106 |
| 0.1485 | 5.0 | 320 | 0.0693 | 0.8774 | 0.9267 | 0.9804 | 0.9938 | 0.8291 | 0.9571 | 0.9866 | 0.7293 | 0.9164 |
| 0.0592 | 5.3125 | 340 | 0.0681 | 0.8739 | 0.9184 | 0.9800 | 0.9927 | 0.7982 | 0.9644 | 0.9862 | 0.7202 | 0.9153 |
| 0.0599 | 5.625 | 360 | 0.0665 | 0.8753 | 0.9207 | 0.9804 | 0.9925 | 0.8039 | 0.9657 | 0.9866 | 0.7224 | 0.9169 |
| 0.0653 | 5.9375 | 380 | 0.0651 | 0.8774 | 0.9304 | 0.9802 | 0.9946 | 0.8461 | 0.9506 | 0.9863 | 0.7301 | 0.9159 |
| 0.0729 | 6.25 | 400 | 0.0635 | 0.8795 | 0.9241 | 0.9812 | 0.9929 | 0.8125 | 0.9670 | 0.9876 | 0.7311 | 0.9197 |
| 0.0713 | 6.5625 | 420 | 0.0653 | 0.8785 | 0.9273 | 0.9802 | 0.9954 | 0.8376 | 0.9490 | 0.9862 | 0.7346 | 0.9147 |
| 0.0584 | 6.875 | 440 | 0.0619 | 0.8772 | 0.9173 | 0.9807 | 0.9943 | 0.7956 | 0.9619 | 0.9866 | 0.7273 | 0.9177 |
| 0.0515 | 7.1875 | 460 | 0.0629 | 0.8644 | 0.9005 | 0.9799 | 0.9933 | 0.7369 | 0.9714 | 0.9871 | 0.6912 | 0.9148 |
| 0.0423 | 7.5 | 480 | 0.0594 | 0.8809 | 0.9237 | 0.9815 | 0.9938 | 0.8119 | 0.9653 | 0.9877 | 0.7337 | 0.9212 |
| 0.0568 | 7.8125 | 500 | 0.0588 | 0.8822 | 0.9369 | 0.9813 | 0.9925 | 0.8564 | 0.9617 | 0.9877 | 0.7387 | 0.9201 |
| 0.0786 | 8.125 | 520 | 0.0587 | 0.8781 | 0.9178 | 0.9814 | 0.9946 | 0.7945 | 0.9644 | 0.9877 | 0.7260 | 0.9205 |
| 0.0475 | 8.4375 | 540 | 0.0643 | 0.8693 | 0.9098 | 0.9796 | 0.9923 | 0.7688 | 0.9683 | 0.9860 | 0.7081 | 0.9137 |
| 0.0556 | 8.75 | 560 | 0.0571 | 0.8738 | 0.9099 | 0.9812 | 0.9948 | 0.7673 | 0.9677 | 0.9880 | 0.7134 | 0.9199 |
| 0.0511 | 9.0625 | 580 | 0.0574 | 0.8786 | 0.9199 | 0.9814 | 0.9923 | 0.7945 | 0.9729 | 0.9878 | 0.7273 | 0.9207 |
| 0.0392 | 9.375 | 600 | 0.0571 | 0.8713 | 0.9074 | 0.9807 | 0.9936 | 0.7576 | 0.9711 | 0.9876 | 0.7088 | 0.9176 |
| 0.0438 | 9.6875 | 620 | 0.0565 | 0.8823 | 0.9326 | 0.9817 | 0.9949 | 0.8461 | 0.9568 | 0.9882 | 0.7374 | 0.9213 |
| 0.157 | 10.0 | 640 | 0.0564 | 0.8829 | 0.9292 | 0.9815 | 0.9944 | 0.8337 | 0.9594 | 0.9877 | 0.7411 | 0.9200 |
| 0.0404 | 10.3125 | 660 | 0.0571 | 0.8814 | 0.9276 | 0.9811 | 0.9957 | 0.8346 | 0.9526 | 0.9870 | 0.7384 | 0.9188 |
| 0.0447 | 10.625 | 680 | 0.0536 | 0.8814 | 0.9250 | 0.9822 | 0.9933 | 0.8113 | 0.9703 | 0.9888 | 0.7316 | 0.9237 |
| 0.0353 | 10.9375 | 700 | 0.0571 | 0.8774 | 0.9162 | 0.9812 | 0.9934 | 0.7857 | 0.9695 | 0.9875 | 0.7250 | 0.9198 |
| 0.0488 | 11.25 | 720 | 0.0574 | 0.8821 | 0.9344 | 0.9811 | 0.9950 | 0.8563 | 0.9520 | 0.9875 | 0.7401 | 0.9186 |
| 0.0444 | 11.5625 | 740 | 0.0595 | 0.8784 | 0.9224 | 0.9792 | 0.9957 | 0.8262 | 0.9454 | 0.9843 | 0.7406 | 0.9104 |
| 0.0452 | 11.875 | 760 | 0.0553 | 0.8806 | 0.9365 | 0.9811 | 0.9957 | 0.8664 | 0.9474 | 0.9878 | 0.7361 | 0.9180 |
| 0.0375 | 12.1875 | 780 | 0.0533 | 0.8812 | 0.9237 | 0.9818 | 0.9918 | 0.8046 | 0.9748 | 0.9881 | 0.7330 | 0.9224 |
| 0.0364 | 12.5 | 800 | 0.0530 | 0.8842 | 0.9276 | 0.9822 | 0.9936 | 0.8217 | 0.9676 | 0.9884 | 0.7405 | 0.9236 |
| 0.031 | 12.8125 | 820 | 0.0542 | 0.8818 | 0.9268 | 0.9815 | 0.9954 | 0.8280 | 0.9571 | 0.9877 | 0.7371 | 0.9206 |
| 0.0322 | 13.125 | 840 | 0.0533 | 0.8841 | 0.9352 | 0.9820 | 0.9939 | 0.8506 | 0.9611 | 0.9886 | 0.7411 | 0.9226 |
| 0.0343 | 13.4375 | 860 | 0.0543 | 0.8817 | 0.9219 | 0.9820 | 0.9942 | 0.8044 | 0.9672 | 0.9883 | 0.7341 | 0.9225 |
| 0.0368 | 13.75 | 880 | 0.0520 | 0.8848 | 0.9308 | 0.9824 | 0.9942 | 0.8334 | 0.9647 | 0.9889 | 0.7410 | 0.9245 |
| 0.0297 | 14.0625 | 900 | 0.0535 | 0.8825 | 0.9256 | 0.9821 | 0.9923 | 0.8111 | 0.9735 | 0.9885 | 0.7355 | 0.9234 |
| 0.0606 | 14.375 | 920 | 0.0538 | 0.8800 | 0.9188 | 0.9819 | 0.9939 | 0.7926 | 0.9699 | 0.9885 | 0.7289 | 0.9225 |
| 0.0429 | 14.6875 | 940 | 0.0535 | 0.8802 | 0.9188 | 0.9823 | 0.9938 | 0.7902 | 0.9724 | 0.9889 | 0.7276 | 0.9241 |
| 0.0692 | 15.0 | 960 | 0.0565 | 0.8813 | 0.9278 | 0.9812 | 0.9898 | 0.8163 | 0.9772 | 0.9873 | 0.7367 | 0.9200 |
| 0.0359 | 15.3125 | 980 | 0.0535 | 0.8832 | 0.9261 | 0.9820 | 0.9954 | 0.8228 | 0.9600 | 0.9882 | 0.7390 | 0.9224 |
| 0.0282 | 15.625 | 1000 | 0.0529 | 0.8838 | 0.9240 | 0.9821 | 0.9958 | 0.8160 | 0.9603 | 0.9882 | 0.7399 | 0.9231 |
| 0.038 | 15.9375 | 1020 | 0.0535 | 0.8808 | 0.9217 | 0.9812 | 0.9946 | 0.8094 | 0.9612 | 0.9872 | 0.7364 | 0.9189 |
| 0.0355 | 16.25 | 1040 | 0.0536 | 0.8822 | 0.9222 | 0.9824 | 0.9946 | 0.8042 | 0.9677 | 0.9888 | 0.7333 | 0.9244 |
| 0.046 | 16.5625 | 1060 | 0.0540 | 0.8831 | 0.9248 | 0.9820 | 0.9919 | 0.8074 | 0.9752 | 0.9883 | 0.7378 | 0.9231 |
| 0.0346 | 16.875 | 1080 | 0.0514 | 0.8851 | 0.9283 | 0.9824 | 0.9937 | 0.8231 | 0.9680 | 0.9886 | 0.7420 | 0.9247 |
| 0.0355 | 17.1875 | 1100 | 0.0523 | 0.8844 | 0.9272 | 0.9823 | 0.9947 | 0.8226 | 0.9641 | 0.9886 | 0.7404 | 0.9241 |
| 0.0317 | 17.5 | 1120 | 0.0517 | 0.8834 | 0.9229 | 0.9826 | 0.9946 | 0.8055 | 0.9686 | 0.9890 | 0.7358 | 0.9253 |
| 0.0489 | 17.8125 | 1140 | 0.0526 | 0.8823 | 0.9213 | 0.9824 | 0.9939 | 0.7990 | 0.9711 | 0.9889 | 0.7333 | 0.9246 |
| 0.0318 | 18.125 | 1160 | 0.0520 | 0.8864 | 0.9314 | 0.9824 | 0.9951 | 0.8384 | 0.9607 | 0.9886 | 0.7464 | 0.9242 |
| 0.0264 | 18.4375 | 1180 | 0.0518 | 0.8853 | 0.9300 | 0.9823 | 0.9946 | 0.8329 | 0.9626 | 0.9885 | 0.7439 | 0.9235 |
| 0.036 | 18.75 | 1200 | 0.0524 | 0.8821 | 0.9200 | 0.9826 | 0.9947 | 0.7958 | 0.9696 | 0.9890 | 0.7320 | 0.9253 |
| 0.0288 | 19.0625 | 1220 | 0.0540 | 0.8794 | 0.9167 | 0.9821 | 0.9933 | 0.7818 | 0.9748 | 0.9888 | 0.7258 | 0.9235 |
| 0.0304 | 19.375 | 1240 | 0.0530 | 0.8833 | 0.9230 | 0.9821 | 0.9955 | 0.8111 | 0.9623 | 0.9883 | 0.7384 | 0.9230 |
| 0.0363 | 19.6875 | 1260 | 0.0530 | 0.8838 | 0.9237 | 0.9823 | 0.9951 | 0.8115 | 0.9644 | 0.9885 | 0.7390 | 0.9238 |
| 0.0371 | 20.0 | 1280 | 0.0518 | 0.8861 | 0.9279 | 0.9828 | 0.9940 | 0.8206 | 0.9692 | 0.9891 | 0.7434 | 0.9259 |
| 0.0253 | 20.3125 | 1300 | 0.0541 | 0.8829 | 0.9226 | 0.9824 | 0.9935 | 0.8023 | 0.9720 | 0.9888 | 0.7356 | 0.9245 |
| 0.0296 | 20.625 | 1320 | 0.0533 | 0.8861 | 0.9321 | 0.9824 | 0.9932 | 0.8351 | 0.9681 | 0.9887 | 0.7454 | 0.9243 |
| 0.0306 | 20.9375 | 1340 | 0.0521 | 0.8842 | 0.9254 | 0.9826 | 0.9936 | 0.8112 | 0.9713 | 0.9891 | 0.7381 | 0.9253 |
| 0.0341 | 21.25 | 1360 | 0.0530 | 0.8828 | 0.9217 | 0.9825 | 0.9939 | 0.8001 | 0.9712 | 0.9889 | 0.7347 | 0.9247 |
| 0.0215 | 21.5625 | 1380 | 0.0537 | 0.8840 | 0.9355 | 0.9817 | 0.9954 | 0.8581 | 0.9529 | 0.9881 | 0.7432 | 0.9206 |
| 0.033 | 21.875 | 1400 | 0.0517 | 0.8868 | 0.9319 | 0.9827 | 0.9944 | 0.8369 | 0.9645 | 0.9890 | 0.7462 | 0.9252 |
| 0.0284 | 22.1875 | 1420 | 0.0530 | 0.8840 | 0.9242 | 0.9825 | 0.9938 | 0.8083 | 0.9706 | 0.9889 | 0.7381 | 0.9249 |
| 0.0238 | 22.5 | 1440 | 0.0518 | 0.8864 | 0.9335 | 0.9826 | 0.9949 | 0.8443 | 0.9613 | 0.9890 | 0.7456 | 0.9247 |
| 0.0222 | 22.8125 | 1460 | 0.0541 | 0.8814 | 0.9211 | 0.9823 | 0.9924 | 0.7942 | 0.9766 | 0.9889 | 0.7314 | 0.9240 |
| 0.0263 | 23.125 | 1480 | 0.0528 | 0.8851 | 0.9273 | 0.9826 | 0.9941 | 0.8200 | 0.9677 | 0.9889 | 0.7414 | 0.9249 |
| 0.0246 | 23.4375 | 1500 | 0.0532 | 0.8858 | 0.9317 | 0.9825 | 0.9935 | 0.8343 | 0.9673 | 0.9889 | 0.7437 | 0.9247 |
| 0.0382 | 23.75 | 1520 | 0.0548 | 0.8835 | 0.9276 | 0.9819 | 0.9913 | 0.8164 | 0.9750 | 0.9881 | 0.7399 | 0.9223 |
| 0.02 | 24.0625 | 1540 | 0.0537 | 0.8845 | 0.9271 | 0.9824 | 0.9926 | 0.8158 | 0.9729 | 0.9887 | 0.7406 | 0.9242 |
| 0.0293 | 24.375 | 1560 | 0.0539 | 0.8854 | 0.9300 | 0.9824 | 0.9927 | 0.8261 | 0.9711 | 0.9887 | 0.7433 | 0.9242 |
| 0.0277 | 24.6875 | 1580 | 0.0533 | 0.8854 | 0.9303 | 0.9824 | 0.9929 | 0.8282 | 0.9698 | 0.9887 | 0.7434 | 0.9241 |
| 0.0225 | 25.0 | 1600 | 0.0534 | 0.8854 | 0.9368 | 0.9823 | 0.9937 | 0.8543 | 0.9625 | 0.9889 | 0.7438 | 0.9235 |
| 0.0349 | 25.3125 | 1620 | 0.0535 | 0.8851 | 0.9260 | 0.9827 | 0.9942 | 0.8153 | 0.9686 | 0.9890 | 0.7411 | 0.9252 |
| 0.0258 | 25.625 | 1640 | 0.0527 | 0.8853 | 0.9279 | 0.9826 | 0.9938 | 0.8212 | 0.9686 | 0.9889 | 0.7423 | 0.9248 |
| 0.033 | 25.9375 | 1660 | 0.0522 | 0.8860 | 0.9312 | 0.9826 | 0.9951 | 0.8368 | 0.9618 | 0.9889 | 0.7445 | 0.9247 |
| 0.0202 | 26.25 | 1680 | 0.0518 | 0.8866 | 0.9307 | 0.9828 | 0.9946 | 0.8325 | 0.9649 | 0.9891 | 0.7453 | 0.9255 |
| 0.0246 | 26.5625 | 1700 | 0.0530 | 0.8863 | 0.9369 | 0.9825 | 0.9936 | 0.8535 | 0.9637 | 0.9890 | 0.7457 | 0.9242 |
| 0.0211 | 26.875 | 1720 | 0.0531 | 0.8859 | 0.9289 | 0.9827 | 0.9938 | 0.8240 | 0.9690 | 0.9892 | 0.7429 | 0.9255 |
| 0.0417 | 27.1875 | 1740 | 0.0525 | 0.8862 | 0.9296 | 0.9828 | 0.9935 | 0.8254 | 0.9700 | 0.9891 | 0.7437 | 0.9257 |
| 0.0392 | 27.5 | 1760 | 0.0522 | 0.8868 | 0.9333 | 0.9828 | 0.9939 | 0.8397 | 0.9662 | 0.9892 | 0.7457 | 0.9256 |
| 0.0248 | 27.8125 | 1780 | 0.0531 | 0.8867 | 0.9329 | 0.9827 | 0.9943 | 0.8399 | 0.9645 | 0.9891 | 0.7461 | 0.9251 |
| 0.0255 | 28.125 | 1800 | 0.0540 | 0.8862 | 0.9329 | 0.9825 | 0.9934 | 0.8381 | 0.9673 | 0.9889 | 0.7449 | 0.9247 |
| 0.0233 | 28.4375 | 1820 | 0.0537 | 0.8858 | 0.9296 | 0.9826 | 0.9931 | 0.8251 | 0.9704 | 0.9889 | 0.7435 | 0.9248 |
| 0.0307 | 28.75 | 1840 | 0.0531 | 0.8865 | 0.9299 | 0.9827 | 0.9944 | 0.8291 | 0.9662 | 0.9891 | 0.7450 | 0.9254 |
| 0.0308 | 29.0625 | 1860 | 0.0536 | 0.8867 | 0.9329 | 0.9827 | 0.9939 | 0.8389 | 0.9660 | 0.9890 | 0.7459 | 0.9251 |
| 0.0259 | 29.375 | 1880 | 0.0540 | 0.8850 | 0.9262 | 0.9825 | 0.9945 | 0.8178 | 0.9664 | 0.9888 | 0.7416 | 0.9245 |
| 0.0295 | 29.6875 | 1900 | 0.0545 | 0.8838 | 0.9244 | 0.9824 | 0.9937 | 0.8093 | 0.9703 | 0.9888 | 0.7382 | 0.9243 |
| 0.0197 | 30.0 | 1920 | 0.0539 | 0.8853 | 0.9285 | 0.9825 | 0.9938 | 0.8235 | 0.9683 | 0.9889 | 0.7425 | 0.9247 |
| 0.0369 | 30.3125 | 1940 | 0.0539 | 0.8846 | 0.9269 | 0.9824 | 0.9942 | 0.8195 | 0.9668 | 0.9888 | 0.7407 | 0.9242 |
| 0.0262 | 30.625 | 1960 | 0.0543 | 0.8849 | 0.9287 | 0.9824 | 0.9936 | 0.8241 | 0.9683 | 0.9889 | 0.7415 | 0.9242 |
| 0.0295 | 30.9375 | 1980 | 0.0547 | 0.8845 | 0.9269 | 0.9825 | 0.9932 | 0.8162 | 0.9714 | 0.9889 | 0.7400 | 0.9246 |
| 0.0247 | 31.25 | 2000 | 0.0550 | 0.8855 | 0.9296 | 0.9824 | 0.9943 | 0.8296 | 0.9649 | 0.9887 | 0.7440 | 0.9239 |
| 0.0283 | 31.5625 | 2020 | 0.0552 | 0.8828 | 0.9222 | 0.9823 | 0.9939 | 0.8023 | 0.9705 | 0.9888 | 0.7358 | 0.9240 |
| 0.0333 | 31.875 | 2040 | 0.0543 | 0.8857 | 0.9303 | 0.9825 | 0.9940 | 0.8308 | 0.9660 | 0.9888 | 0.7439 | 0.9244 |
| 0.0256 | 32.1875 | 2060 | 0.0540 | 0.8860 | 0.9365 | 0.9824 | 0.9941 | 0.8535 | 0.9617 | 0.9890 | 0.7450 | 0.9239 |
| 0.0237 | 32.5 | 2080 | 0.0539 | 0.8846 | 0.9241 | 0.9827 | 0.9943 | 0.8083 | 0.9697 | 0.9891 | 0.7390 | 0.9256 |
| 0.0236 | 32.8125 | 2100 | 0.0537 | 0.8855 | 0.9276 | 0.9827 | 0.9937 | 0.8187 | 0.9703 | 0.9891 | 0.7417 | 0.9256 |
| 0.0238 | 33.125 | 2120 | 0.0539 | 0.8849 | 0.9265 | 0.9825 | 0.9947 | 0.8191 | 0.9659 | 0.9889 | 0.7409 | 0.9248 |
| 0.0265 | 33.4375 | 2140 | 0.0543 | 0.8858 | 0.9316 | 0.9825 | 0.9938 | 0.8344 | 0.9664 | 0.9889 | 0.7438 | 0.9246 |
| 0.0274 | 33.75 | 2160 | 0.0555 | 0.8826 | 0.9225 | 0.9824 | 0.9939 | 0.8029 | 0.9706 | 0.9890 | 0.7344 | 0.9245 |
| 0.0232 | 34.0625 | 2180 | 0.0543 | 0.8857 | 0.9316 | 0.9826 | 0.9935 | 0.8336 | 0.9677 | 0.9890 | 0.7434 | 0.9248 |
| 0.0276 | 34.375 | 2200 | 0.0547 | 0.8838 | 0.9240 | 0.9826 | 0.9941 | 0.8082 | 0.9697 | 0.9891 | 0.7373 | 0.9251 |
| 0.033 | 34.6875 | 2220 | 0.0538 | 0.8851 | 0.9267 | 0.9826 | 0.9948 | 0.8198 | 0.9657 | 0.9890 | 0.7413 | 0.9251 |
| 0.0333 | 35.0 | 2240 | 0.0540 | 0.8857 | 0.9291 | 0.9827 | 0.9937 | 0.8247 | 0.9690 | 0.9891 | 0.7426 | 0.9254 |
| 0.0221 | 35.3125 | 2260 | 0.0545 | 0.8856 | 0.9291 | 0.9826 | 0.9941 | 0.8260 | 0.9674 | 0.9891 | 0.7426 | 0.9251 |
| 0.0286 | 35.625 | 2280 | 0.0549 | 0.8852 | 0.9292 | 0.9824 | 0.9940 | 0.8275 | 0.9661 | 0.9887 | 0.7428 | 0.9240 |
| 0.0231 | 35.9375 | 2300 | 0.0545 | 0.8855 | 0.9288 | 0.9826 | 0.9941 | 0.8251 | 0.9673 | 0.9890 | 0.7425 | 0.9250 |
| 0.0301 | 36.25 | 2320 | 0.0544 | 0.8853 | 0.9284 | 0.9825 | 0.9946 | 0.8258 | 0.9650 | 0.9888 | 0.7425 | 0.9245 |
| 0.0311 | 36.5625 | 2340 | 0.0545 | 0.8853 | 0.9289 | 0.9826 | 0.9937 | 0.8245 | 0.9685 | 0.9889 | 0.7422 | 0.9248 |
| 0.0231 | 36.875 | 2360 | 0.0548 | 0.8854 | 0.9284 | 0.9825 | 0.9945 | 0.8257 | 0.9650 | 0.9888 | 0.7430 | 0.9243 |
| 0.0187 | 37.1875 | 2380 | 0.0548 | 0.8859 | 0.9313 | 0.9826 | 0.9941 | 0.8342 | 0.9656 | 0.9890 | 0.7441 | 0.9247 |
| 0.0355 | 37.5 | 2400 | 0.0550 | 0.8846 | 0.9261 | 0.9825 | 0.9945 | 0.8173 | 0.9665 | 0.9889 | 0.7405 | 0.9244 |
| 0.021 | 37.8125 | 2420 | 0.0547 | 0.8857 | 0.9300 | 0.9825 | 0.9940 | 0.8295 | 0.9664 | 0.9889 | 0.7436 | 0.9246 |
| 0.0274 | 38.125 | 2440 | 0.0545 | 0.8854 | 0.9285 | 0.9826 | 0.9940 | 0.8240 | 0.9676 | 0.9890 | 0.7423 | 0.9249 |
| 0.0288 | 38.4375 | 2460 | 0.0545 | 0.8849 | 0.9270 | 0.9826 | 0.9941 | 0.8188 | 0.9682 | 0.9890 | 0.7408 | 0.9250 |
| 0.0315 | 38.75 | 2480 | 0.0548 | 0.8847 | 0.9260 | 0.9826 | 0.9942 | 0.8158 | 0.9681 | 0.9890 | 0.7404 | 0.9248 |
| 0.0221 | 39.0625 | 2500 | 0.0550 | 0.8858 | 0.9295 | 0.9826 | 0.9941 | 0.8276 | 0.9668 | 0.9890 | 0.7435 | 0.9248 |
| 0.021 | 39.375 | 2520 | 0.0552 | 0.8855 | 0.9290 | 0.9826 | 0.9940 | 0.8255 | 0.9674 | 0.9889 | 0.7429 | 0.9248 |
| 0.0261 | 39.6875 | 2540 | 0.0544 | 0.8852 | 0.9274 | 0.9826 | 0.9942 | 0.8208 | 0.9673 | 0.9889 | 0.7419 | 0.9248 |
| 0.0152 | 40.0 | 2560 | 0.0547 | 0.8851 | 0.9274 | 0.9826 | 0.9941 | 0.8203 | 0.9678 | 0.9889 | 0.7417 | 0.9247 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"necrosis",
"root"
] |
Thibaut/route_background_semantic |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# route_background_semantic
This model is a fine-tuned version of [nvidia/segformer-b3-finetuned-cityscapes-1024-1024](https://huggingface.co/nvidia/segformer-b3-finetuned-cityscapes-1024-1024) on the Logiroad/route_background_semantic dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2360
- Mean Iou: 0.1916
- Mean Accuracy: 0.2447
- Overall Accuracy: 0.2962
- Accuracy Unlabeled: nan
- Accuracy Découpe: 0.2865
- Accuracy Reflet météo: 0.0
- Accuracy Autre réparation: 0.3437
- Accuracy Glaçage ou ressuage: 0.0386
- Accuracy Emergence: 0.5549
- Iou Unlabeled: 0.0
- Iou Découpe: 0.2515
- Iou Reflet météo: 0.0
- Iou Autre réparation: 0.3230
- Iou Glaçage ou ressuage: 0.0369
- Iou Emergence: 0.5379
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: adamw_torch (OptimizerNames.ADAMW_TORCH) with betas=(0.9,0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: polynomial
- training_steps: 10000
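Training here is capped by steps rather than epochs: at 2427 steps per epoch (see the table below), 10000 steps stop partway through the fifth epoch (10000 / 2427 ≈ 4.12). A sketch of the equivalent `TrainingArguments`; the `output_dir` is an assumption:
```python
# Sketch only: step-based training with a polynomial decay schedule.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="route_background_semantic",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=1337,
    optim="adamw_torch",
    lr_scheduler_type="polynomial",
    max_steps=10_000,   # takes precedence over num_train_epochs
)
```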
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Découpe | Accuracy Reflet météo | Accuracy Autre réparation | Accuracy Glaçage ou ressuage | Accuracy Emergence | Iou Unlabeled | Iou Découpe | Iou Reflet météo | Iou Autre réparation | Iou Glaçage ou ressuage | Iou Emergence |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:---------------------:|:-------------------------:|:----------------------------:|:------------------:|:-------------:|:-----------:|:----------------:|:--------------------:|:-----------------------:|:-------------:|
| 0.2715 | 1.0 | 2427 | 0.2682 | 0.0521 | 0.0669 | 0.1828 | nan | 0.0813 | 0.0 | 0.2533 | 0.0 | 0.0 | 0.0 | 0.0766 | 0.0 | 0.2362 | 0.0 | 0.0 |
| 0.2815 | 2.0 | 4854 | 0.2682 | 0.1165 | 0.1436 | 0.1593 | nan | 0.1108 | 0.0 | 0.1982 | 0.0 | 0.4090 | 0.0 | 0.1014 | 0.0 | 0.1916 | 0.0 | 0.4057 |
| 0.2638 | 3.0 | 7281 | 0.2420 | 0.1664 | 0.2100 | 0.2564 | nan | 0.2346 | 0.0 | 0.3039 | 0.0030 | 0.5085 | 0.0 | 0.2128 | 0.0 | 0.2854 | 0.0030 | 0.4973 |
| 0.2703 | 4.0 | 9708 | 0.2333 | 0.1941 | 0.2475 | 0.3074 | nan | 0.2843 | 0.0 | 0.3612 | 0.0446 | 0.5473 | 0.0 | 0.2512 | 0.0 | 0.3383 | 0.0429 | 0.5320 |
| 0.2197 | 4.1203 | 10000 | 0.2360 | 0.1916 | 0.2447 | 0.2962 | nan | 0.2865 | 0.0 | 0.3437 | 0.0386 | 0.5549 | 0.0 | 0.2515 | 0.0 | 0.3230 | 0.0369 | 0.5379 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.3.0
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"unlabeled",
"découpe",
"reflet météo",
"autre réparation",
"glaçage ou ressuage",
"emergence"
] |
mujerry/mit-b0-03-04-25-15-21_necrosis |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mit-b0-03-04-25-15-21_necrosis
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1459
- Mean Iou: 0.7491
- Mean Accuracy: 0.8053
- Overall Accuracy: 0.9465
- Accuracy Background: 0.9950
- Accuracy Necrosis: 0.5999
- Accuracy Root: 0.8210
- Iou Background: 0.9514
- Iou Necrosis: 0.5271
- Iou Root: 0.7687
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 50
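For reference, the list above maps roughly onto the following `TrainingArguments`; this is a sketch, not the exact training script, `output_dir` is a placeholder, and the Adam betas/epsilon in the list are the library defaults.
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="mit-b0-necrosis",   # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), eps=1e-8 by default
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=50,
)
```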
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Necrosis | Accuracy Root | Iou Background | Iou Necrosis | Iou Root |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------:|:-------------:|:--------------:|:------------:|:--------:|
| 1.0286 | 0.625 | 20 | 1.0682 | 0.2971 | 0.5482 | 0.4605 | 0.3950 | 0.5573 | 0.6921 | 0.3931 | 0.0404 | 0.4577 |
| 0.8087 | 1.25 | 40 | 0.9066 | 0.4491 | 0.6758 | 0.6734 | 0.6231 | 0.5144 | 0.8901 | 0.6221 | 0.0597 | 0.6654 |
| 0.5207 | 1.875 | 60 | 0.7452 | 0.5011 | 0.7573 | 0.7185 | 0.6773 | 0.7209 | 0.8737 | 0.6762 | 0.0877 | 0.7393 |
| 0.4503 | 2.5 | 80 | 0.5908 | 0.5902 | 0.8478 | 0.8390 | 0.8422 | 0.8815 | 0.8199 | 0.8346 | 0.1719 | 0.7639 |
| 0.3617 | 3.125 | 100 | 0.7355 | 0.4588 | 0.7862 | 0.7203 | 0.6706 | 0.7915 | 0.8965 | 0.6659 | 0.1379 | 0.5726 |
| 0.2934 | 3.75 | 120 | 0.3994 | 0.6875 | 0.8473 | 0.9220 | 0.9489 | 0.7428 | 0.8502 | 0.9244 | 0.3873 | 0.7507 |
| 0.3384 | 4.375 | 140 | 0.3423 | 0.6961 | 0.8540 | 0.9276 | 0.9587 | 0.7662 | 0.8371 | 0.9332 | 0.3891 | 0.7659 |
| 0.3014 | 5.0 | 160 | 0.3397 | 0.6933 | 0.8418 | 0.9209 | 0.9532 | 0.7439 | 0.8282 | 0.9239 | 0.4393 | 0.7167 |
| 0.3316 | 5.625 | 180 | 0.3360 | 0.6717 | 0.8395 | 0.9088 | 0.9339 | 0.7433 | 0.8414 | 0.9076 | 0.4098 | 0.6979 |
| 0.2285 | 6.25 | 200 | 0.2552 | 0.7329 | 0.8140 | 0.9439 | 0.9727 | 0.5721 | 0.8973 | 0.9498 | 0.4681 | 0.7810 |
| 0.1486 | 6.875 | 220 | 0.2143 | 0.7410 | 0.8183 | 0.9470 | 0.9837 | 0.6061 | 0.8651 | 0.9526 | 0.4807 | 0.7898 |
| 0.2264 | 7.5 | 240 | 0.2154 | 0.7168 | 0.7719 | 0.9352 | 0.9956 | 0.5491 | 0.7710 | 0.9366 | 0.4871 | 0.7265 |
| 0.172 | 8.125 | 260 | 0.1926 | 0.7330 | 0.7920 | 0.9444 | 0.9906 | 0.5496 | 0.8358 | 0.9491 | 0.4799 | 0.7700 |
| 0.1841 | 8.75 | 280 | 0.2086 | 0.7252 | 0.8199 | 0.9352 | 0.9926 | 0.7112 | 0.7559 | 0.9405 | 0.5125 | 0.7225 |
| 0.1422 | 9.375 | 300 | 0.1668 | 0.7427 | 0.8002 | 0.9473 | 0.9922 | 0.5676 | 0.8409 | 0.9528 | 0.4972 | 0.7781 |
| 0.1747 | 10.0 | 320 | 0.1877 | 0.7257 | 0.8219 | 0.9401 | 0.9943 | 0.6956 | 0.7756 | 0.9496 | 0.4864 | 0.7409 |
| 0.1492 | 10.625 | 340 | 0.2011 | 0.7252 | 0.8346 | 0.9387 | 0.9887 | 0.7309 | 0.7842 | 0.9485 | 0.4869 | 0.7402 |
| 0.1835 | 11.25 | 360 | 0.1758 | 0.7429 | 0.8284 | 0.9438 | 0.9953 | 0.7006 | 0.7893 | 0.9512 | 0.5228 | 0.7547 |
| 0.2357 | 11.875 | 380 | 0.1770 | 0.7439 | 0.8264 | 0.9441 | 0.9941 | 0.6874 | 0.7978 | 0.9508 | 0.5223 | 0.7587 |
| 0.1232 | 12.5 | 400 | 0.1619 | 0.7534 | 0.8204 | 0.9504 | 0.9918 | 0.6205 | 0.8490 | 0.9575 | 0.5127 | 0.7901 |
| 0.1584 | 13.125 | 420 | 0.1911 | 0.7264 | 0.8091 | 0.9366 | 0.9947 | 0.6714 | 0.7613 | 0.9416 | 0.5111 | 0.7265 |
| 0.2099 | 13.75 | 440 | 0.1593 | 0.7495 | 0.8354 | 0.9485 | 0.9868 | 0.6695 | 0.8500 | 0.9563 | 0.5053 | 0.7869 |
| 0.2723 | 14.375 | 460 | 0.1668 | 0.7353 | 0.8029 | 0.9426 | 0.9906 | 0.5995 | 0.8185 | 0.9477 | 0.4999 | 0.7583 |
| 0.1406 | 15.0 | 480 | 0.1714 | 0.7404 | 0.8172 | 0.9426 | 0.9918 | 0.6555 | 0.8043 | 0.9477 | 0.5170 | 0.7564 |
| 0.1624 | 15.625 | 500 | 0.1527 | 0.7445 | 0.8007 | 0.9511 | 0.9913 | 0.5435 | 0.8674 | 0.9591 | 0.4811 | 0.7934 |
| 0.1519 | 16.25 | 520 | 0.1503 | 0.7459 | 0.8044 | 0.9495 | 0.9936 | 0.5743 | 0.8455 | 0.9568 | 0.4973 | 0.7837 |
| 0.0996 | 16.875 | 540 | 0.1437 | 0.7551 | 0.8216 | 0.9524 | 0.9898 | 0.6065 | 0.8685 | 0.9607 | 0.5066 | 0.7981 |
| 0.3985 | 17.5 | 560 | 0.1520 | 0.7355 | 0.7891 | 0.9472 | 0.9931 | 0.5310 | 0.8431 | 0.9535 | 0.4759 | 0.7771 |
| 0.0979 | 18.125 | 580 | 0.1586 | 0.7447 | 0.8072 | 0.9453 | 0.9940 | 0.6105 | 0.8171 | 0.9502 | 0.5178 | 0.7661 |
| 0.1027 | 18.75 | 600 | 0.1482 | 0.7551 | 0.8261 | 0.9482 | 0.9937 | 0.6605 | 0.8242 | 0.9544 | 0.5341 | 0.7768 |
| 0.1372 | 19.375 | 620 | 0.1504 | 0.7487 | 0.8107 | 0.9463 | 0.9947 | 0.6195 | 0.8180 | 0.9512 | 0.5256 | 0.7693 |
| 0.1219 | 20.0 | 640 | 0.1473 | 0.7558 | 0.8297 | 0.9488 | 0.9944 | 0.6723 | 0.8223 | 0.9556 | 0.5331 | 0.7786 |
| 0.1045 | 20.625 | 660 | 0.1827 | 0.7228 | 0.8050 | 0.9333 | 0.9962 | 0.6811 | 0.7375 | 0.9373 | 0.5231 | 0.7080 |
| 0.1034 | 21.25 | 680 | 0.1534 | 0.7489 | 0.8431 | 0.9451 | 0.9939 | 0.7407 | 0.7946 | 0.9537 | 0.5347 | 0.7583 |
| 0.2095 | 21.875 | 700 | 0.1469 | 0.7401 | 0.7944 | 0.9477 | 0.9935 | 0.5486 | 0.8411 | 0.9543 | 0.4900 | 0.7760 |
| 0.2314 | 22.5 | 720 | 0.1474 | 0.7529 | 0.8222 | 0.9493 | 0.9933 | 0.6387 | 0.8345 | 0.9566 | 0.5206 | 0.7814 |
| 0.1133 | 23.125 | 740 | 0.1645 | 0.7352 | 0.8301 | 0.9404 | 0.9956 | 0.7276 | 0.7671 | 0.9486 | 0.5204 | 0.7366 |
| 0.0817 | 23.75 | 760 | 0.1341 | 0.7563 | 0.8089 | 0.9537 | 0.9936 | 0.5653 | 0.8678 | 0.9612 | 0.5056 | 0.8020 |
| 0.1422 | 24.375 | 780 | 0.1359 | 0.7622 | 0.8287 | 0.9521 | 0.9919 | 0.6404 | 0.8537 | 0.9595 | 0.5333 | 0.7937 |
| 0.0963 | 25.0 | 800 | 0.1450 | 0.7558 | 0.8365 | 0.9487 | 0.9945 | 0.6977 | 0.8172 | 0.9568 | 0.5359 | 0.7748 |
| 0.085 | 25.625 | 820 | 0.1543 | 0.7370 | 0.7929 | 0.9441 | 0.9952 | 0.5704 | 0.8132 | 0.9491 | 0.5024 | 0.7596 |
| 0.2654 | 26.25 | 840 | 0.1599 | 0.7338 | 0.7903 | 0.9424 | 0.9957 | 0.5727 | 0.8027 | 0.9472 | 0.5030 | 0.7511 |
| 0.1506 | 26.875 | 860 | 0.1503 | 0.7478 | 0.8091 | 0.9466 | 0.9940 | 0.6096 | 0.8236 | 0.9526 | 0.5227 | 0.7681 |
| 0.114 | 27.5 | 880 | 0.1440 | 0.7463 | 0.8029 | 0.9488 | 0.9938 | 0.5738 | 0.8412 | 0.9556 | 0.5037 | 0.7795 |
| 0.0748 | 28.125 | 900 | 0.1611 | 0.7362 | 0.7949 | 0.9416 | 0.9956 | 0.5935 | 0.7957 | 0.9461 | 0.5162 | 0.7463 |
| 0.1368 | 28.75 | 920 | 0.1457 | 0.7496 | 0.8074 | 0.9477 | 0.9948 | 0.5993 | 0.8281 | 0.9535 | 0.5214 | 0.7737 |
| 0.0836 | 29.375 | 940 | 0.1622 | 0.7170 | 0.7663 | 0.9408 | 0.9951 | 0.4931 | 0.8106 | 0.9464 | 0.4579 | 0.7469 |
| 0.215 | 30.0 | 960 | 0.1340 | 0.7450 | 0.7959 | 0.9511 | 0.9945 | 0.5366 | 0.8566 | 0.9588 | 0.4857 | 0.7905 |
| 0.1397 | 30.625 | 980 | 0.1370 | 0.7417 | 0.7931 | 0.9495 | 0.9945 | 0.5367 | 0.8480 | 0.9568 | 0.4853 | 0.7830 |
| 0.1468 | 31.25 | 1000 | 0.1373 | 0.7591 | 0.8221 | 0.9507 | 0.9944 | 0.6332 | 0.8387 | 0.9574 | 0.5337 | 0.7863 |
| 0.0733 | 31.875 | 1020 | 0.1380 | 0.7439 | 0.7946 | 0.9500 | 0.9943 | 0.5380 | 0.8515 | 0.9573 | 0.4894 | 0.7851 |
| 0.1454 | 32.5 | 1040 | 0.1414 | 0.7522 | 0.8106 | 0.9487 | 0.9947 | 0.6048 | 0.8324 | 0.9548 | 0.5235 | 0.7781 |
| 0.1203 | 33.125 | 1060 | 0.1459 | 0.7498 | 0.8197 | 0.9467 | 0.9952 | 0.6512 | 0.8126 | 0.9531 | 0.5282 | 0.7681 |
| 0.2697 | 33.75 | 1080 | 0.1381 | 0.7541 | 0.8128 | 0.9494 | 0.9944 | 0.6080 | 0.8359 | 0.9554 | 0.5256 | 0.7815 |
| 0.0884 | 34.375 | 1100 | 0.1629 | 0.7276 | 0.7795 | 0.9403 | 0.9961 | 0.5474 | 0.7950 | 0.9443 | 0.4961 | 0.7426 |
| 0.1911 | 35.0 | 1120 | 0.1395 | 0.7585 | 0.8293 | 0.9501 | 0.9935 | 0.6603 | 0.8341 | 0.9574 | 0.5354 | 0.7827 |
| 0.129 | 35.625 | 1140 | 0.1709 | 0.7278 | 0.7804 | 0.9399 | 0.9955 | 0.5512 | 0.7946 | 0.9440 | 0.4991 | 0.7402 |
| 0.0965 | 36.25 | 1160 | 0.1409 | 0.7472 | 0.7995 | 0.9490 | 0.9948 | 0.5633 | 0.8403 | 0.9552 | 0.5057 | 0.7806 |
| 0.0956 | 36.875 | 1180 | 0.1403 | 0.7389 | 0.7885 | 0.9489 | 0.9942 | 0.5223 | 0.8491 | 0.9559 | 0.4793 | 0.7817 |
| 0.1023 | 37.5 | 1200 | 0.1512 | 0.7438 | 0.8022 | 0.9447 | 0.9952 | 0.6001 | 0.8113 | 0.9495 | 0.5208 | 0.7610 |
| 0.1341 | 38.125 | 1220 | 0.1527 | 0.7422 | 0.7961 | 0.9450 | 0.9953 | 0.5771 | 0.8159 | 0.9496 | 0.5143 | 0.7627 |
| 0.0669 | 38.75 | 1240 | 0.1340 | 0.7459 | 0.7950 | 0.9516 | 0.9938 | 0.5282 | 0.8630 | 0.9591 | 0.4851 | 0.7933 |
| 0.2038 | 39.375 | 1260 | 0.1347 | 0.7572 | 0.8120 | 0.9509 | 0.9946 | 0.5967 | 0.8446 | 0.9568 | 0.5269 | 0.7879 |
| 0.1287 | 40.0 | 1280 | 0.1554 | 0.7415 | 0.7959 | 0.9433 | 0.9957 | 0.5872 | 0.8046 | 0.9471 | 0.5224 | 0.7550 |
| 0.1505 | 40.625 | 1300 | 0.1353 | 0.7470 | 0.7972 | 0.9499 | 0.9946 | 0.5491 | 0.8477 | 0.9563 | 0.4997 | 0.7851 |
| 0.0827 | 41.25 | 1320 | 0.1408 | 0.7522 | 0.8089 | 0.9481 | 0.9947 | 0.6021 | 0.8300 | 0.9535 | 0.5274 | 0.7757 |
| 0.1537 | 41.875 | 1340 | 0.1469 | 0.7468 | 0.8007 | 0.9458 | 0.9950 | 0.5874 | 0.8196 | 0.9504 | 0.5244 | 0.7656 |
| 0.1328 | 42.5 | 1360 | 0.1415 | 0.7490 | 0.8030 | 0.9477 | 0.9948 | 0.5835 | 0.8306 | 0.9531 | 0.5199 | 0.7741 |
| 0.0971 | 43.125 | 1380 | 0.1330 | 0.7578 | 0.8133 | 0.9515 | 0.9943 | 0.5966 | 0.8489 | 0.9580 | 0.5252 | 0.7903 |
| 0.1021 | 43.75 | 1400 | 0.1332 | 0.7520 | 0.8033 | 0.9512 | 0.9945 | 0.5634 | 0.8521 | 0.9579 | 0.5086 | 0.7894 |
| 0.0707 | 44.375 | 1420 | 0.1404 | 0.7496 | 0.8029 | 0.9481 | 0.9952 | 0.5825 | 0.8311 | 0.9536 | 0.5201 | 0.7753 |
| 0.0767 | 45.0 | 1440 | 0.1388 | 0.7520 | 0.8066 | 0.9486 | 0.9949 | 0.5916 | 0.8332 | 0.9543 | 0.5243 | 0.7775 |
| 0.0747 | 45.625 | 1460 | 0.1351 | 0.7594 | 0.8200 | 0.9504 | 0.9944 | 0.6274 | 0.8383 | 0.9567 | 0.5369 | 0.7846 |
| 0.2155 | 46.25 | 1480 | 0.1413 | 0.7549 | 0.8147 | 0.9478 | 0.9949 | 0.6254 | 0.8237 | 0.9531 | 0.5384 | 0.7732 |
| 0.0757 | 46.875 | 1500 | 0.1379 | 0.7560 | 0.8147 | 0.9495 | 0.9944 | 0.6137 | 0.8359 | 0.9555 | 0.5314 | 0.7810 |
| 0.1457 | 47.5 | 1520 | 0.1528 | 0.7459 | 0.8057 | 0.9441 | 0.9955 | 0.6174 | 0.8042 | 0.9484 | 0.5323 | 0.7570 |
| 0.0952 | 48.125 | 1540 | 0.1542 | 0.7467 | 0.8072 | 0.9438 | 0.9955 | 0.6246 | 0.8015 | 0.9479 | 0.5368 | 0.7556 |
| 0.1606 | 48.75 | 1560 | 0.1465 | 0.7526 | 0.8136 | 0.9464 | 0.9950 | 0.6303 | 0.8154 | 0.9512 | 0.5393 | 0.7672 |
| 0.1153 | 49.375 | 1580 | 0.1411 | 0.7511 | 0.8063 | 0.9483 | 0.9946 | 0.5916 | 0.8328 | 0.9539 | 0.5225 | 0.7768 |
| 0.065 | 50.0 | 1600 | 0.1459 | 0.7491 | 0.8053 | 0.9465 | 0.9950 | 0.5999 | 0.8210 | 0.9514 | 0.5271 | 0.7687 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"necrosis",
"root"
] |
Dnq2025/mask2former-finetuned-ER-Mito-LD5 |
# mask2former-finetuned-ER-Mito-LD5
This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
It achieves the following results on the evaluation set:
- Loss: 33.3884
## Model description
More information needed
## Intended uses & limitations
More information needed
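Until intended uses are documented, a minimal semantic-inference sketch follows; it assumes the image processor is bundled with this checkpoint, and the input path is a placeholder for an electron-microscopy image.
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

repo = "Dnq2025/mask2former-finetuned-ER-Mito-LD5"
processor = AutoImageProcessor.from_pretrained(repo)
model = Mask2FormerForUniversalSegmentation.from_pretrained(repo)

image = Image.open("em_slice.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Collapse Mask2Former's mask/class query outputs into one per-pixel label map.
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]  # (H, W) tensor of ids for background / er / mito / ld
```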
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 6450
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 53.498 | 1.0 | 129 | 39.6802 |
| 39.3157 | 2.0 | 258 | 35.9394 |
| 37.2096 | 3.0 | 387 | 32.0225 |
| 31.9877 | 4.0 | 516 | 33.2635 |
| 30.1511 | 5.0 | 645 | 29.8756 |
| 28.3667 | 6.0 | 774 | 30.3257 |
| 26.7492 | 7.0 | 903 | 27.9416 |
| 25.6035 | 8.0 | 1032 | 27.4391 |
| 24.5091 | 9.0 | 1161 | 28.4225 |
| 23.8578 | 10.0 | 1290 | 26.4271 |
| 22.6785 | 11.0 | 1419 | 26.4148 |
| 22.0847 | 12.0 | 1548 | 26.6679 |
| 22.0106 | 13.0 | 1677 | 26.7030 |
| 20.45 | 14.0 | 1806 | 26.1600 |
| 20.1949 | 15.0 | 1935 | 26.2444 |
| 19.1922 | 16.0 | 2064 | 27.0105 |
| 18.9458 | 17.0 | 2193 | 24.9449 |
| 18.46 | 18.0 | 2322 | 27.8372 |
| 17.3966 | 19.0 | 2451 | 27.0517 |
| 17.5908 | 20.0 | 2580 | 28.5696 |
| 16.9413 | 21.0 | 2709 | 27.3707 |
| 16.3963 | 22.0 | 2838 | 26.4041 |
| 16.2948 | 23.0 | 2967 | 25.3316 |
| 16.2511 | 24.0 | 3096 | 27.9766 |
| 15.4496 | 25.0 | 3225 | 27.6993 |
| 15.1992 | 26.0 | 3354 | 27.9919 |
| 14.9445 | 27.0 | 3483 | 25.4937 |
| 14.8226 | 28.0 | 3612 | 28.7659 |
| 14.264 | 29.0 | 3741 | 26.7018 |
| 14.348 | 30.0 | 3870 | 28.9018 |
| 13.936 | 31.0 | 3999 | 28.2813 |
| 13.7577 | 32.0 | 4128 | 30.0501 |
| 13.1629 | 33.0 | 4257 | 28.0087 |
| 14.1035 | 34.0 | 4386 | 28.3435 |
| 13.4379 | 35.0 | 4515 | 28.9629 |
| 12.9478 | 36.0 | 4644 | 29.8509 |
| 12.8114 | 37.0 | 4773 | 28.9036 |
| 13.2322 | 38.0 | 4902 | 29.9045 |
| 12.7433 | 39.0 | 5031 | 31.3430 |
| 12.3428 | 40.0 | 5160 | 31.3746 |
| 12.3295 | 41.0 | 5289 | 31.6009 |
| 12.1459 | 42.0 | 5418 | 31.6387 |
| 11.8999 | 43.0 | 5547 | 32.2195 |
| 12.4076 | 44.0 | 5676 | 32.5034 |
| 11.7797 | 45.0 | 5805 | 32.9062 |
| 11.1345 | 46.0 | 5934 | 32.4447 |
| 12.3552 | 47.0 | 6063 | 32.7274 |
| 11.3111 | 48.0 | 6192 | 33.0397 |
| 11.8742 | 49.0 | 6321 | 33.3195 |
| 11.5268 | 50.0 | 6450 | 33.3223 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"er",
"mito",
"ld"
] |
mujerry/mit-b0_whitefly |
# mit-b0_whitefly
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2644
- Mean Iou: 0.4948
- Mean Accuracy: 0.4968
- Overall Accuracy: 0.9893
- Accuracy Background: 0.9907
- Accuracy Whitefly: 0.0029
- Iou Background: 0.9893
- Iou Whitefly: 0.0004
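Note that the headline numbers are dominated by the background class: the mean IoU is just the average of the two per-class IoUs, (0.9893 + 0.0004) / 2 ≈ 0.4948, so the near-zero whitefly IoU is the figure to watch. Below is a sketch of how such per-class IoU is computed, assuming the `evaluate` library's `mean_iou` metric (the usual choice in Trainer segmentation examples; the card does not state which backend was used).
```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 4x4 masks: 0 = background, 1 = whitefly.
pred = np.zeros((4, 4), dtype=np.int64)  # model predicts all background
ref = np.zeros((4, 4), dtype=np.int64)
ref[0, 0] = 1                            # one whitefly pixel it misses

results = metric.compute(
    predictions=[pred], references=[ref], num_labels=2, ignore_index=255
)
print(results["per_category_iou"], results["mean_iou"])
```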
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Whitefly | Iou Background | Iou Whitefly |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------:|:--------------:|:------------:|
| 0.6275 | 0.4 | 20 | 0.6849 | 0.3529 | 0.4550 | 0.7049 | 0.7056 | 0.2044 | 0.7048 | 0.0010 |
| 0.4423 | 0.8 | 40 | 0.5826 | 0.4745 | 0.5157 | 0.9468 | 0.9480 | 0.0834 | 0.9468 | 0.0022 |
| 0.3793 | 1.2 | 60 | 0.4444 | 0.4868 | 0.4927 | 0.9731 | 0.9744 | 0.0110 | 0.9731 | 0.0006 |
| 0.3102 | 1.6 | 80 | 0.3347 | 0.4976 | 0.4986 | 0.9949 | 0.9963 | 0.0009 | 0.9949 | 0.0002 |
| 0.272 | 2.0 | 100 | 0.3100 | 0.4983 | 0.4991 | 0.9963 | 0.9977 | 0.0004 | 0.9963 | 0.0002 |
| 0.3003 | 2.4 | 120 | 0.2579 | 0.4983 | 0.4991 | 0.9965 | 0.9979 | 0.0003 | 0.9965 | 0.0001 |
| 0.2558 | 2.8 | 140 | 0.2644 | 0.4948 | 0.4968 | 0.9893 | 0.9907 | 0.0029 | 0.9893 | 0.0004 |
### Framework versions
- Transformers 4.44.1
- Pytorch 2.6.0+cpu
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"background",
"whitefly"
] |
Logiroad/route_background_semantic_x2 |
# route_background_semantic_x2
This model is a fine-tuned version of [Logiroad/route_background_semantic_x2](https://huggingface.co/Logiroad/route_background_semantic_x2) (the same repository, apparently continued training from an earlier checkpoint) on the Logiroad/route_background_semantic dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4592
- Mean Iou: 0.2260
- Mean Accuracy: 0.2982
- Overall Accuracy: 0.3342
- Accuracy Unlabeled: nan
- Accuracy Découpe (cut-out): 0.2100
- Accuracy Reflet météo (weather reflection): 0.1023
- Accuracy Autre réparation (other repair): 0.3946
- Accuracy Glaçage ou ressuage (glazing or bleeding): 0.3102
- Accuracy Emergence: 0.4740
- Iou Unlabeled: 0.0
- Iou Découpe (cut-out): 0.1894
- Iou Reflet météo (weather reflection): 0.1009
- Iou Autre réparation (other repair): 0.3656
- Iou Glaçage ou ressuage (glazing or bleeding): 0.2399
- Iou Emergence: 0.4600
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 1337
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 200000
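The cosine schedule over 200,000 steps can be sketched standalone as below; the `Trainer` builds the equivalent internally, the tiny linear model is only a stand-in so the snippet runs, and zero warmup is an assumption since none is listed.
```python
import torch
from transformers import get_scheduler

model = torch.nn.Linear(8, 8)  # stand-in parameters
optimizer = torch.optim.AdamW(model.parameters(), lr=6e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_scheduler(
    "cosine", optimizer=optimizer, num_warmup_steps=0, num_training_steps=200_000
)

for step in range(3):  # a few steps, just to show the call order
    optimizer.step()
    scheduler.step()
    print(step, scheduler.get_last_lr())
```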
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Découpe | Accuracy Reflet météo | Accuracy Autre réparation | Accuracy Glaçage ou ressuage | Accuracy Emergence | Iou Unlabeled | Iou Découpe | Iou Reflet météo | Iou Autre réparation | Iou Glaçage ou ressuage | Iou Emergence |
|:-------------:|:--------:|:------:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:---------------------:|:-------------------------:|:----------------------------:|:------------------:|:-------------:|:-----------:|:----------------:|:--------------------:|:-----------------------:|:-------------:|
| 0.1536 | 1.0 | 1618 | 0.3285 | 0.1946 | 0.2516 | 0.2839 | nan | 0.3153 | 0.0910 | 0.3010 | 0.1459 | 0.4049 | 0.0 | 0.2603 | 0.0871 | 0.2855 | 0.1365 | 0.3979 |
| 0.1476 | 2.0 | 3236 | 0.3270 | 0.2135 | 0.2824 | 0.2909 | nan | 0.2356 | 0.0396 | 0.3264 | 0.2896 | 0.5205 | 0.0 | 0.2038 | 0.0381 | 0.3074 | 0.2251 | 0.5063 |
| 0.133 | 3.0 | 4854 | 0.3192 | 0.2185 | 0.2969 | 0.2849 | nan | 0.2856 | 0.0318 | 0.2957 | 0.3583 | 0.5131 | 0.0 | 0.2490 | 0.0310 | 0.2801 | 0.2491 | 0.5021 |
| 0.1196 | 4.0 | 6472 | 0.3359 | 0.2116 | 0.2695 | 0.2911 | nan | 0.2549 | 0.1695 | 0.3208 | 0.1817 | 0.4206 | 0.0 | 0.2237 | 0.1600 | 0.3062 | 0.1659 | 0.4136 |
| 0.135 | 5.0 | 8090 | 0.3212 | 0.1941 | 0.2522 | 0.2972 | nan | 0.1861 | 0.2104 | 0.3464 | 0.2456 | 0.2725 | 0.0 | 0.1775 | 0.1884 | 0.3222 | 0.2075 | 0.2691 |
| 0.1131 | 6.0 | 9708 | 0.3212 | 0.2149 | 0.2833 | 0.2909 | nan | 0.1995 | 0.0422 | 0.3379 | 0.2808 | 0.5560 | 0.0 | 0.1875 | 0.0419 | 0.3168 | 0.2055 | 0.5378 |
| 0.1302 | 7.0 | 11326 | 0.3247 | 0.2326 | 0.3296 | 0.2937 | nan | 0.1116 | 0.2272 | 0.3412 | 0.4223 | 0.5458 | 0.0 | 0.1035 | 0.2134 | 0.3139 | 0.2360 | 0.5290 |
| 0.1277 | 8.0 | 12944 | 0.3144 | 0.2320 | 0.3045 | 0.2709 | nan | 0.2661 | 0.1839 | 0.2734 | 0.2839 | 0.5150 | 0.0 | 0.2167 | 0.1800 | 0.2618 | 0.2326 | 0.5009 |
| 0.1222 | 9.0 | 14562 | 0.3055 | 0.2187 | 0.2814 | 0.2922 | nan | 0.1104 | 0.1301 | 0.3606 | 0.2939 | 0.5121 | 0.0 | 0.1078 | 0.1270 | 0.3341 | 0.2447 | 0.4988 |
| 0.1258 | 10.0 | 16180 | 0.3037 | 0.2630 | 0.3598 | 0.3654 | nan | 0.3144 | 0.2259 | 0.3946 | 0.3186 | 0.5456 | 0.0 | 0.2612 | 0.1954 | 0.3715 | 0.2214 | 0.5285 |
| 0.1214 | 11.0 | 17798 | 0.3165 | 0.2267 | 0.2981 | 0.2951 | nan | 0.1908 | 0.1475 | 0.3383 | 0.2773 | 0.5366 | 0.0 | 0.1717 | 0.1424 | 0.3123 | 0.2133 | 0.5205 |
| 0.1495 | 12.0 | 19416 | 0.3099 | 0.2153 | 0.2804 | 0.2703 | nan | 0.2186 | 0.1025 | 0.2965 | 0.2727 | 0.5117 | 0.0 | 0.1936 | 0.0973 | 0.2800 | 0.2188 | 0.5024 |
| 0.1455 | 13.0 | 21034 | 0.3158 | 0.2191 | 0.2792 | 0.2912 | nan | 0.2227 | 0.1164 | 0.3343 | 0.1687 | 0.5537 | 0.0 | 0.2003 | 0.1105 | 0.3133 | 0.1529 | 0.5374 |
| 0.1207 | 14.0 | 22652 | 0.3487 | 0.2099 | 0.2877 | 0.2788 | nan | 0.1758 | 0.0801 | 0.3184 | 0.3497 | 0.5147 | 0.0 | 0.1602 | 0.0776 | 0.2989 | 0.2403 | 0.4826 |
| 0.1377 | 15.0 | 24270 | 0.3429 | 0.2285 | 0.2958 | 0.3017 | nan | 0.2715 | 0.0783 | 0.3323 | 0.2284 | 0.5685 | 0.0 | 0.2310 | 0.0755 | 0.3111 | 0.2018 | 0.5516 |
| 0.1017 | 16.0 | 25888 | 0.3044 | 0.2232 | 0.2861 | 0.3146 | nan | 0.2510 | 0.0860 | 0.3632 | 0.1736 | 0.5566 | 0.0 | 0.2181 | 0.0811 | 0.3415 | 0.1635 | 0.5349 |
| 0.1143 | 17.0 | 27506 | 0.3295 | 0.1975 | 0.2650 | 0.2489 | nan | 0.1555 | 0.1185 | 0.2812 | 0.3012 | 0.4687 | 0.0 | 0.1383 | 0.1116 | 0.2649 | 0.2162 | 0.4539 |
| 0.1178 | 18.0 | 29124 | 0.3098 | 0.2177 | 0.2892 | 0.3239 | nan | 0.2205 | 0.0896 | 0.3835 | 0.2163 | 0.5359 | 0.0 | 0.1929 | 0.0853 | 0.3525 | 0.1537 | 0.5220 |
| 0.1088 | 19.0 | 30742 | 0.3152 | 0.2153 | 0.2910 | 0.3126 | nan | 0.3170 | 0.0881 | 0.3294 | 0.3120 | 0.4085 | 0.0 | 0.2638 | 0.0835 | 0.3131 | 0.2343 | 0.3970 |
| 0.1179 | 20.0 | 32360 | 0.3254 | 0.2003 | 0.2557 | 0.2795 | nan | 0.1022 | 0.1233 | 0.3542 | 0.1883 | 0.5105 | 0.0 | 0.0989 | 0.1192 | 0.3229 | 0.1651 | 0.4960 |
| 0.1237 | 21.0 | 33978 | 0.3498 | 0.2061 | 0.2710 | 0.2720 | nan | 0.2235 | 0.0811 | 0.3013 | 0.2573 | 0.4918 | 0.0 | 0.1956 | 0.0801 | 0.2871 | 0.1908 | 0.4828 |
| 0.1053 | 22.0 | 35596 | 0.3051 | 0.2482 | 0.3256 | 0.3249 | nan | 0.2774 | 0.2052 | 0.3533 | 0.2449 | 0.5473 | 0.0 | 0.2481 | 0.1926 | 0.3347 | 0.1835 | 0.5300 |
| 0.1246 | 23.0 | 37214 | 0.3448 | 0.2283 | 0.2942 | 0.3091 | nan | 0.2029 | 0.1376 | 0.3601 | 0.2433 | 0.5272 | 0.0 | 0.1901 | 0.1290 | 0.3371 | 0.2034 | 0.5102 |
| 0.1351 | 24.0 | 38832 | 0.3383 | 0.1865 | 0.2490 | 0.3183 | nan | 0.1811 | 0.0292 | 0.3895 | 0.3113 | 0.3339 | 0.0 | 0.1647 | 0.0290 | 0.3641 | 0.2312 | 0.3300 |
| 0.1007 | 25.0 | 40450 | 0.3507 | 0.1966 | 0.2529 | 0.2824 | nan | 0.1047 | 0.0695 | 0.3562 | 0.2720 | 0.4619 | 0.0 | 0.0971 | 0.0685 | 0.3307 | 0.2286 | 0.4548 |
| 0.1173 | 26.0 | 42068 | 0.3579 | 0.2311 | 0.3147 | 0.3175 | nan | 0.3686 | 0.0762 | 0.3216 | 0.2766 | 0.5306 | 0.0 | 0.2827 | 0.0753 | 0.3083 | 0.2020 | 0.5185 |
| 0.1451 | 27.0 | 43686 | 0.3641 | 0.2351 | 0.3139 | 0.2743 | nan | 0.2167 | 0.2535 | 0.2887 | 0.2743 | 0.5362 | 0.0 | 0.1899 | 0.2141 | 0.2740 | 0.2171 | 0.5155 |
| 0.1412 | 28.0 | 45304 | 0.3442 | 0.2226 | 0.2927 | 0.3329 | nan | 0.2880 | 0.0836 | 0.3740 | 0.2592 | 0.4588 | 0.0 | 0.2455 | 0.0827 | 0.3459 | 0.2091 | 0.4524 |
| 0.1147 | 29.0 | 46922 | 0.3662 | 0.1812 | 0.2335 | 0.2758 | nan | 0.1350 | 0.0095 | 0.3448 | 0.2444 | 0.4336 | 0.0 | 0.1272 | 0.0095 | 0.3165 | 0.2057 | 0.4285 |
| 0.1171 | 30.0 | 48540 | 0.3362 | 0.2031 | 0.2592 | 0.2895 | nan | 0.2038 | 0.0681 | 0.3403 | 0.2106 | 0.4734 | 0.0 | 0.1799 | 0.0669 | 0.3197 | 0.1904 | 0.4618 |
| 0.092 | 31.0 | 50158 | 0.3502 | 0.2403 | 0.3151 | 0.3167 | nan | 0.1925 | 0.2658 | 0.3605 | 0.2906 | 0.4660 | 0.0 | 0.1711 | 0.2469 | 0.3321 | 0.2336 | 0.4580 |
| 0.1144 | 32.0 | 51776 | 0.3259 | 0.2194 | 0.2835 | 0.3300 | nan | 0.1992 | 0.0378 | 0.4015 | 0.2510 | 0.5278 | 0.0 | 0.1851 | 0.0374 | 0.3735 | 0.2041 | 0.5164 |
| 0.1063 | 33.0 | 53394 | 0.3323 | 0.2468 | 0.3224 | 0.3813 | nan | 0.1410 | 0.2256 | 0.4799 | 0.2847 | 0.4808 | 0.0 | 0.1277 | 0.2172 | 0.4326 | 0.2418 | 0.4612 |
| 0.0905 | 34.0 | 55012 | 0.3525 | 0.2342 | 0.3062 | 0.3177 | nan | 0.1441 | 0.2233 | 0.3825 | 0.2678 | 0.5135 | 0.0 | 0.1343 | 0.2136 | 0.3507 | 0.2125 | 0.4938 |
| 0.1112 | 35.0 | 56630 | 0.3824 | 0.1652 | 0.2235 | 0.2693 | nan | 0.0766 | 0.0035 | 0.3517 | 0.2814 | 0.4043 | 0.0 | 0.0745 | 0.0035 | 0.3252 | 0.1894 | 0.3983 |
| 0.1043 | 36.0 | 58248 | 0.3510 | 0.1874 | 0.2424 | 0.2676 | nan | 0.1363 | 0.1159 | 0.3228 | 0.2512 | 0.3860 | 0.0 | 0.1249 | 0.1086 | 0.2997 | 0.2101 | 0.3813 |
| 0.0946 | 37.0 | 59866 | 0.3340 | 0.2337 | 0.3170 | 0.3046 | nan | 0.2499 | 0.1323 | 0.3298 | 0.3350 | 0.5382 | 0.0 | 0.2200 | 0.1286 | 0.3105 | 0.2246 | 0.5188 |
| 0.1 | 38.0 | 61484 | 0.3437 | 0.2177 | 0.2853 | 0.3238 | nan | 0.1663 | 0.0574 | 0.4003 | 0.2527 | 0.5497 | 0.0 | 0.1514 | 0.0556 | 0.3648 | 0.1971 | 0.5372 |
| 0.0981 | 39.0 | 63102 | 0.3336 | 0.2382 | 0.3202 | 0.3120 | nan | 0.2613 | 0.2716 | 0.3331 | 0.2587 | 0.4764 | 0.0 | 0.2284 | 0.2479 | 0.3103 | 0.1892 | 0.4532 |
| 0.1124 | 40.0 | 64720 | 0.3413 | 0.2313 | 0.2967 | 0.2924 | nan | 0.1631 | 0.2792 | 0.3376 | 0.2198 | 0.4837 | 0.0 | 0.1491 | 0.2641 | 0.3166 | 0.1893 | 0.4687 |
| 0.0801 | 41.0 | 66338 | 0.3572 | 0.2438 | 0.3242 | 0.3384 | nan | 0.2931 | 0.1708 | 0.3720 | 0.2468 | 0.5383 | 0.0 | 0.2406 | 0.1637 | 0.3464 | 0.1905 | 0.5215 |
| 0.1144 | 42.0 | 67956 | 0.3530 | 0.2161 | 0.2753 | 0.3320 | nan | 0.2198 | 0.0688 | 0.3995 | 0.2190 | 0.4692 | 0.0 | 0.2014 | 0.0679 | 0.3698 | 0.1977 | 0.4598 |
| 0.0817 | 43.0 | 69574 | 0.3602 | 0.2335 | 0.3137 | 0.3095 | nan | 0.3497 | 0.1127 | 0.3126 | 0.2757 | 0.5180 | 0.0 | 0.2783 | 0.1074 | 0.2992 | 0.2122 | 0.5038 |
| 0.1063 | 44.0 | 71192 | 0.3908 | 0.2167 | 0.2789 | 0.2958 | nan | 0.2053 | 0.0701 | 0.3455 | 0.2354 | 0.5382 | 0.0 | 0.1853 | 0.0669 | 0.3270 | 0.2049 | 0.5159 |
| 0.0763 | 45.0 | 72810 | 0.3780 | 0.2017 | 0.2672 | 0.2916 | nan | 0.1444 | 0.0337 | 0.3584 | 0.2827 | 0.5170 | 0.0 | 0.1358 | 0.0336 | 0.3271 | 0.2123 | 0.5015 |
| 0.0836 | 46.0 | 74428 | 0.3590 | 0.2065 | 0.2705 | 0.3276 | nan | 0.1314 | 0.0617 | 0.4173 | 0.2690 | 0.4729 | 0.0 | 0.1221 | 0.0613 | 0.3797 | 0.2120 | 0.4637 |
| 0.1067 | 47.0 | 76046 | 0.3186 | 0.2449 | 0.3231 | 0.3483 | nan | 0.2393 | 0.2747 | 0.3961 | 0.2597 | 0.4459 | 0.0 | 0.2053 | 0.2541 | 0.3662 | 0.2085 | 0.4354 |
| 0.1018 | 48.0 | 77664 | 0.3630 | 0.1991 | 0.2561 | 0.3048 | nan | 0.1744 | 0.0343 | 0.3754 | 0.2208 | 0.4758 | 0.0 | 0.1558 | 0.0340 | 0.3474 | 0.1904 | 0.4670 |
| 0.1003 | 49.0 | 79282 | 0.3545 | 0.1945 | 0.2490 | 0.3086 | nan | 0.1174 | 0.1484 | 0.3910 | 0.2321 | 0.3562 | 0.0 | 0.1115 | 0.1369 | 0.3590 | 0.2088 | 0.3508 |
| 0.0951 | 50.0 | 80900 | 0.3743 | 0.2071 | 0.2698 | 0.2796 | nan | 0.2191 | 0.0901 | 0.3108 | 0.3042 | 0.4249 | 0.0 | 0.1973 | 0.0866 | 0.2914 | 0.2556 | 0.4117 |
| 0.0943 | 51.0 | 82518 | 0.3617 | 0.2232 | 0.2884 | 0.3471 | nan | 0.2586 | 0.0504 | 0.4095 | 0.2395 | 0.4843 | 0.0 | 0.2264 | 0.0495 | 0.3833 | 0.2134 | 0.4666 |
| 0.1014 | 52.0 | 84136 | 0.3768 | 0.2137 | 0.2704 | 0.3245 | nan | 0.1143 | 0.1510 | 0.4168 | 0.1950 | 0.4747 | 0.0 | 0.1083 | 0.1437 | 0.3796 | 0.1862 | 0.4646 |
| 0.0997 | 53.0 | 85754 | 0.3766 | 0.2220 | 0.2871 | 0.3373 | nan | 0.1174 | 0.1707 | 0.4269 | 0.2748 | 0.4454 | 0.0 | 0.1088 | 0.1655 | 0.3906 | 0.2287 | 0.4383 |
| 0.0645 | 54.0 | 87372 | 0.4183 | 0.2126 | 0.2763 | 0.3203 | nan | 0.1631 | 0.0605 | 0.3934 | 0.3020 | 0.4626 | 0.0 | 0.1513 | 0.0592 | 0.3608 | 0.2539 | 0.4505 |
| 0.0742 | 55.0 | 88990 | 0.3843 | 0.2145 | 0.2829 | 0.3019 | nan | 0.2210 | 0.0829 | 0.3464 | 0.2715 | 0.4927 | 0.0 | 0.1962 | 0.0817 | 0.3227 | 0.2069 | 0.4798 |
| 0.0638 | 56.0 | 90608 | 0.4046 | 0.2162 | 0.2815 | 0.2923 | nan | 0.1623 | 0.1480 | 0.3434 | 0.2968 | 0.4570 | 0.0 | 0.1484 | 0.1408 | 0.3183 | 0.2432 | 0.4463 |
| 0.1227 | 57.0 | 92226 | 0.4071 | 0.2013 | 0.2601 | 0.3011 | nan | 0.1607 | 0.0459 | 0.3715 | 0.2361 | 0.4864 | 0.0 | 0.1477 | 0.0454 | 0.3452 | 0.1946 | 0.4748 |
| 0.0796 | 58.0 | 93844 | 0.3865 | 0.1974 | 0.2576 | 0.2997 | nan | 0.1835 | 0.1047 | 0.3566 | 0.2646 | 0.3786 | 0.0 | 0.1677 | 0.1000 | 0.3308 | 0.2178 | 0.3679 |
| 0.0832 | 59.0 | 95462 | 0.3849 | 0.2066 | 0.2746 | 0.3128 | nan | 0.2008 | 0.1288 | 0.3664 | 0.2881 | 0.3889 | 0.0 | 0.1784 | 0.1237 | 0.3400 | 0.2202 | 0.3771 |
| 0.0714 | 60.0 | 97080 | 0.3913 | 0.1992 | 0.2571 | 0.2720 | nan | 0.1706 | 0.0606 | 0.3200 | 0.2670 | 0.4674 | 0.0 | 0.1541 | 0.0602 | 0.2997 | 0.2239 | 0.4576 |
| 0.0882 | 61.0 | 98698 | 0.3769 | 0.2155 | 0.2760 | 0.3161 | nan | 0.1551 | 0.1556 | 0.3867 | 0.2425 | 0.4398 | 0.0 | 0.1431 | 0.1487 | 0.3597 | 0.2093 | 0.4322 |
| 0.078 | 62.0 | 100316 | 0.3816 | 0.2128 | 0.2756 | 0.3190 | nan | 0.1814 | 0.1381 | 0.3862 | 0.2170 | 0.4553 | 0.0 | 0.1665 | 0.1334 | 0.3558 | 0.1750 | 0.4459 |
| 0.0532 | 63.0 | 101934 | 0.4062 | 0.2447 | 0.3230 | 0.3194 | nan | 0.1818 | 0.1862 | 0.3712 | 0.3104 | 0.5654 | 0.0 | 0.1646 | 0.1739 | 0.3445 | 0.2403 | 0.5452 |
| 0.0769 | 64.0 | 103552 | 0.3938 | 0.2195 | 0.2964 | 0.3102 | nan | 0.2051 | 0.1015 | 0.3580 | 0.3204 | 0.4971 | 0.0 | 0.1849 | 0.1001 | 0.3359 | 0.2122 | 0.4840 |
| 0.0679 | 65.0 | 105170 | 0.3898 | 0.2284 | 0.3000 | 0.3276 | nan | 0.1962 | 0.0995 | 0.3875 | 0.3245 | 0.4925 | 0.0 | 0.1802 | 0.0977 | 0.3590 | 0.2512 | 0.4824 |
| 0.0722 | 66.0 | 106788 | 0.3821 | 0.2068 | 0.2743 | 0.2834 | nan | 0.2566 | 0.0561 | 0.3101 | 0.2607 | 0.4880 | 0.0 | 0.2224 | 0.0532 | 0.2914 | 0.1946 | 0.4791 |
| 0.0688 | 67.0 | 108406 | 0.3995 | 0.2203 | 0.2833 | 0.3219 | nan | 0.2287 | 0.0955 | 0.3765 | 0.2414 | 0.4742 | 0.0 | 0.2089 | 0.0918 | 0.3486 | 0.2089 | 0.4632 |
| 0.0583 | 68.0 | 110024 | 0.3796 | 0.2382 | 0.3142 | 0.3523 | nan | 0.3091 | 0.0508 | 0.3947 | 0.2992 | 0.5169 | 0.0 | 0.2659 | 0.0497 | 0.3690 | 0.2430 | 0.5016 |
| 0.0752 | 69.0 | 111642 | 0.3736 | 0.2325 | 0.3029 | 0.3436 | nan | 0.2437 | 0.1286 | 0.4000 | 0.2590 | 0.4832 | 0.0 | 0.2173 | 0.1226 | 0.3703 | 0.2123 | 0.4725 |
| 0.0684 | 70.0 | 113260 | 0.4240 | 0.2417 | 0.3161 | 0.3053 | nan | 0.2741 | 0.2841 | 0.3188 | 0.2518 | 0.4516 | 0.0 | 0.2457 | 0.2532 | 0.3051 | 0.2059 | 0.4405 |
| 0.0576 | 71.0 | 114878 | 0.4012 | 0.2400 | 0.3176 | 0.3716 | nan | 0.2878 | 0.0465 | 0.4306 | 0.3120 | 0.5109 | 0.0 | 0.2588 | 0.0461 | 0.3959 | 0.2386 | 0.5005 |
| 0.0618 | 72.0 | 116496 | 0.4119 | 0.2336 | 0.3094 | 0.3317 | nan | 0.1587 | 0.1409 | 0.4023 | 0.3115 | 0.5335 | 0.0 | 0.1477 | 0.1361 | 0.3675 | 0.2366 | 0.5137 |
| 0.0575 | 73.0 | 118114 | 0.4023 | 0.2235 | 0.2936 | 0.3413 | nan | 0.2253 | 0.0683 | 0.4051 | 0.2925 | 0.4770 | 0.0 | 0.2048 | 0.0669 | 0.3741 | 0.2326 | 0.4629 |
| 0.073 | 74.0 | 119732 | 0.3971 | 0.2178 | 0.2924 | 0.3398 | nan | 0.1959 | 0.0452 | 0.4113 | 0.3229 | 0.4868 | 0.0 | 0.1818 | 0.0447 | 0.3749 | 0.2312 | 0.4744 |
| 0.07 | 75.0 | 121350 | 0.3865 | 0.1997 | 0.2659 | 0.3288 | nan | 0.1304 | 0.0645 | 0.4171 | 0.3100 | 0.4075 | 0.0 | 0.1230 | 0.0614 | 0.3836 | 0.2314 | 0.3986 |
| 0.0552 | 76.0 | 122968 | 0.4262 | 0.2029 | 0.2652 | 0.3223 | nan | 0.1944 | 0.0575 | 0.3902 | 0.2743 | 0.4097 | 0.0 | 0.1786 | 0.0555 | 0.3584 | 0.2237 | 0.4014 |
| 0.0626 | 77.0 | 124586 | 0.4440 | 0.2056 | 0.2702 | 0.2805 | nan | 0.1827 | 0.0676 | 0.3263 | 0.2844 | 0.4900 | 0.0 | 0.1673 | 0.0647 | 0.3046 | 0.2181 | 0.4790 |
| 0.0594 | 78.0 | 126204 | 0.4015 | 0.2068 | 0.2730 | 0.3049 | nan | 0.1606 | 0.0764 | 0.3708 | 0.2801 | 0.4769 | 0.0 | 0.1496 | 0.0742 | 0.3418 | 0.2126 | 0.4625 |
| 0.0551 | 79.0 | 127822 | 0.4051 | 0.2168 | 0.2837 | 0.3178 | nan | 0.1862 | 0.0891 | 0.3808 | 0.2819 | 0.4805 | 0.0 | 0.1727 | 0.0839 | 0.3520 | 0.2252 | 0.4673 |
| 0.0737 | 80.0 | 129440 | 0.4053 | 0.2241 | 0.3040 | 0.3281 | nan | 0.2147 | 0.1083 | 0.3794 | 0.3538 | 0.4637 | 0.0 | 0.1970 | 0.1046 | 0.3515 | 0.2391 | 0.4525 |
| 0.0523 | 81.0 | 131058 | 0.4287 | 0.2331 | 0.3110 | 0.3480 | nan | 0.2120 | 0.1624 | 0.4087 | 0.3273 | 0.4444 | 0.0 | 0.1923 | 0.1553 | 0.3770 | 0.2414 | 0.4328 |
| 0.0522 | 82.0 | 132676 | 0.4198 | 0.2249 | 0.3021 | 0.3204 | nan | 0.2100 | 0.1189 | 0.3689 | 0.3454 | 0.4674 | 0.0 | 0.1854 | 0.1143 | 0.3432 | 0.2533 | 0.4531 |
| 0.0694 | 83.0 | 134294 | 0.4372 | 0.2090 | 0.2779 | 0.3326 | nan | 0.2138 | 0.0558 | 0.3968 | 0.3070 | 0.4163 | 0.0 | 0.1940 | 0.0553 | 0.3669 | 0.2323 | 0.4053 |
| 0.0596 | 84.0 | 135912 | 0.4540 | 0.1884 | 0.2489 | 0.2795 | nan | 0.1759 | 0.0494 | 0.3313 | 0.2761 | 0.4117 | 0.0 | 0.1614 | 0.0491 | 0.3086 | 0.2106 | 0.4011 |
| 0.0523 | 85.0 | 137530 | 0.4173 | 0.2098 | 0.2794 | 0.3306 | nan | 0.2147 | 0.0429 | 0.3955 | 0.2869 | 0.4572 | 0.0 | 0.1909 | 0.0426 | 0.3660 | 0.2199 | 0.4396 |
| 0.0554 | 86.0 | 139148 | 0.4479 | 0.1978 | 0.2672 | 0.2893 | nan | 0.1779 | 0.0756 | 0.3396 | 0.3110 | 0.4316 | 0.0 | 0.1626 | 0.0744 | 0.3166 | 0.2180 | 0.4154 |
| 0.0576 | 87.0 | 140766 | 0.4331 | 0.2185 | 0.2891 | 0.3221 | nan | 0.1979 | 0.1091 | 0.3801 | 0.3079 | 0.4503 | 0.0 | 0.1803 | 0.1080 | 0.3527 | 0.2344 | 0.4357 |
| 0.0636 | 88.0 | 142384 | 0.4258 | 0.2216 | 0.2886 | 0.3187 | nan | 0.2221 | 0.0793 | 0.3699 | 0.3006 | 0.4711 | 0.0 | 0.2002 | 0.0785 | 0.3449 | 0.2466 | 0.4597 |
| 0.0548 | 89.0 | 144002 | 0.4336 | 0.2160 | 0.2791 | 0.3151 | nan | 0.1993 | 0.0941 | 0.3717 | 0.2973 | 0.4330 | 0.0 | 0.1807 | 0.0910 | 0.3446 | 0.2553 | 0.4241 |
| 0.049 | 90.0 | 145620 | 0.4347 | 0.2061 | 0.2675 | 0.3190 | nan | 0.2026 | 0.0675 | 0.3814 | 0.2766 | 0.4093 | 0.0 | 0.1848 | 0.0657 | 0.3541 | 0.2305 | 0.4014 |
| 0.0473 | 91.0 | 147238 | 0.4505 | 0.2204 | 0.2877 | 0.3190 | nan | 0.2445 | 0.0620 | 0.3659 | 0.2843 | 0.4817 | 0.0 | 0.2187 | 0.0603 | 0.3422 | 0.2293 | 0.4717 |
| 0.0549 | 92.0 | 148856 | 0.4442 | 0.2161 | 0.2873 | 0.3250 | nan | 0.1933 | 0.0520 | 0.3897 | 0.3166 | 0.4850 | 0.0 | 0.1778 | 0.0512 | 0.3591 | 0.2369 | 0.4718 |
| 0.0686 | 93.0 | 150474 | 0.4109 | 0.2169 | 0.2884 | 0.3240 | nan | 0.2008 | 0.0503 | 0.3860 | 0.3134 | 0.4913 | 0.0 | 0.1839 | 0.0497 | 0.3558 | 0.2380 | 0.4741 |
| 0.0454 | 94.0 | 152092 | 0.4171 | 0.2247 | 0.2977 | 0.3372 | nan | 0.2248 | 0.0889 | 0.3962 | 0.3025 | 0.4764 | 0.0 | 0.2032 | 0.0870 | 0.3651 | 0.2296 | 0.4634 |
| 0.0524 | 95.0 | 153710 | 0.4433 | 0.2277 | 0.2968 | 0.3373 | nan | 0.2245 | 0.0724 | 0.3978 | 0.3022 | 0.4872 | 0.0 | 0.2036 | 0.0716 | 0.3687 | 0.2469 | 0.4752 |
| 0.0515 | 96.0 | 155328 | 0.4446 | 0.2191 | 0.2932 | 0.3172 | nan | 0.2429 | 0.0641 | 0.3602 | 0.3211 | 0.4779 | 0.0 | 0.2190 | 0.0633 | 0.3379 | 0.2296 | 0.4646 |
| 0.0692 | 97.0 | 156946 | 0.4296 | 0.2129 | 0.2824 | 0.3282 | nan | 0.2163 | 0.0618 | 0.3889 | 0.2959 | 0.4492 | 0.0 | 0.1948 | 0.0612 | 0.3600 | 0.2223 | 0.4394 |
| 0.0472 | 98.0 | 158564 | 0.4504 | 0.2169 | 0.2852 | 0.3254 | nan | 0.2146 | 0.0692 | 0.3841 | 0.2970 | 0.4611 | 0.0 | 0.1946 | 0.0681 | 0.3558 | 0.2348 | 0.4482 |
| 0.0507 | 99.0 | 160182 | 0.4381 | 0.2190 | 0.2889 | 0.3330 | nan | 0.2120 | 0.0546 | 0.3974 | 0.2970 | 0.4835 | 0.0 | 0.1924 | 0.0539 | 0.3667 | 0.2315 | 0.4697 |
| 0.0513 | 100.0 | 161800 | 0.4407 | 0.2197 | 0.2935 | 0.3351 | nan | 0.2145 | 0.0631 | 0.3971 | 0.3206 | 0.4725 | 0.0 | 0.1921 | 0.0623 | 0.3651 | 0.2368 | 0.4619 |
| 0.0434 | 101.0 | 163418 | 0.4556 | 0.2186 | 0.2899 | 0.3188 | nan | 0.1937 | 0.0717 | 0.3780 | 0.3179 | 0.4882 | 0.0 | 0.1762 | 0.0703 | 0.3504 | 0.2412 | 0.4734 |
| 0.0522 | 102.0 | 165036 | 0.4457 | 0.2253 | 0.2959 | 0.3563 | nan | 0.2291 | 0.0720 | 0.4260 | 0.3041 | 0.4481 | 0.0 | 0.2045 | 0.0701 | 0.3936 | 0.2463 | 0.4372 |
| 0.0405 | 103.0 | 166654 | 0.4543 | 0.2206 | 0.2918 | 0.3346 | nan | 0.2086 | 0.0653 | 0.3990 | 0.3096 | 0.4765 | 0.0 | 0.1887 | 0.0639 | 0.3701 | 0.2366 | 0.4642 |
| 0.0555 | 104.0 | 168272 | 0.4456 | 0.2263 | 0.2981 | 0.3301 | nan | 0.2349 | 0.0773 | 0.3827 | 0.3030 | 0.4926 | 0.0 | 0.2114 | 0.0760 | 0.3567 | 0.2355 | 0.4781 |
| 0.0459 | 105.0 | 169890 | 0.4642 | 0.2165 | 0.2848 | 0.3165 | nan | 0.2156 | 0.0592 | 0.3699 | 0.3083 | 0.4708 | 0.0 | 0.1958 | 0.0586 | 0.3459 | 0.2406 | 0.4580 |
| 0.0394 | 106.0 | 171508 | 0.4528 | 0.2266 | 0.2959 | 0.3351 | nan | 0.2186 | 0.0755 | 0.3958 | 0.3042 | 0.4854 | 0.0 | 0.1982 | 0.0738 | 0.3683 | 0.2483 | 0.4708 |
| 0.0486 | 107.0 | 173126 | 0.4553 | 0.2265 | 0.2966 | 0.3380 | nan | 0.2108 | 0.0900 | 0.4015 | 0.3070 | 0.4735 | 0.0 | 0.1910 | 0.0885 | 0.3721 | 0.2470 | 0.4604 |
| 0.0721 | 108.0 | 174744 | 0.4521 | 0.2269 | 0.2989 | 0.3426 | nan | 0.2141 | 0.0980 | 0.4067 | 0.3088 | 0.4669 | 0.0 | 0.1928 | 0.0963 | 0.3759 | 0.2421 | 0.4545 |
| 0.0459 | 109.0 | 176362 | 0.4593 | 0.2264 | 0.2998 | 0.3325 | nan | 0.2102 | 0.1067 | 0.3911 | 0.3148 | 0.4761 | 0.0 | 0.1892 | 0.1050 | 0.3621 | 0.2402 | 0.4617 |
| 0.0516 | 110.0 | 177980 | 0.4499 | 0.2227 | 0.2939 | 0.3310 | nan | 0.2023 | 0.0812 | 0.3940 | 0.3071 | 0.4849 | 0.0 | 0.1841 | 0.0802 | 0.3647 | 0.2373 | 0.4698 |
| 0.0426 | 111.0 | 179598 | 0.4624 | 0.2233 | 0.2931 | 0.3291 | nan | 0.1970 | 0.0981 | 0.3923 | 0.2991 | 0.4790 | 0.0 | 0.1792 | 0.0971 | 0.3629 | 0.2364 | 0.4644 |
| 0.0581 | 112.0 | 181216 | 0.4551 | 0.2220 | 0.2943 | 0.3285 | nan | 0.1975 | 0.1018 | 0.3899 | 0.3096 | 0.4727 | 0.0 | 0.1790 | 0.1004 | 0.3612 | 0.2329 | 0.4586 |
| 0.0353 | 113.0 | 182834 | 0.4592 | 0.2187 | 0.2890 | 0.3332 | nan | 0.1989 | 0.0776 | 0.3994 | 0.3060 | 0.4631 | 0.0 | 0.1809 | 0.0765 | 0.3700 | 0.2345 | 0.4502 |
| 0.0505 | 114.0 | 184452 | 0.4591 | 0.2198 | 0.2912 | 0.3336 | nan | 0.2058 | 0.0756 | 0.3976 | 0.3109 | 0.4659 | 0.0 | 0.1874 | 0.0747 | 0.3688 | 0.2356 | 0.4526 |
| 0.0543 | 115.0 | 186070 | 0.4683 | 0.2223 | 0.2954 | 0.3286 | nan | 0.2183 | 0.0824 | 0.3846 | 0.3137 | 0.4778 | 0.0 | 0.1977 | 0.0811 | 0.3584 | 0.2335 | 0.4631 |
| 0.0501 | 116.0 | 187688 | 0.4676 | 0.2264 | 0.2997 | 0.3346 | nan | 0.2246 | 0.0842 | 0.3917 | 0.3109 | 0.4869 | 0.0 | 0.2023 | 0.0830 | 0.3644 | 0.2375 | 0.4714 |
| 0.0463 | 117.0 | 189306 | 0.4574 | 0.2266 | 0.2995 | 0.3368 | nan | 0.2097 | 0.0955 | 0.3989 | 0.3120 | 0.4814 | 0.0 | 0.1901 | 0.0944 | 0.3693 | 0.2394 | 0.4666 |
| 0.0455 | 118.0 | 190924 | 0.4510 | 0.2295 | 0.3025 | 0.3390 | nan | 0.2146 | 0.1061 | 0.3999 | 0.3092 | 0.4830 | 0.0 | 0.1930 | 0.1046 | 0.3705 | 0.2409 | 0.4682 |
| 0.0443 | 119.0 | 192542 | 0.4614 | 0.2270 | 0.2997 | 0.3368 | nan | 0.2123 | 0.1032 | 0.3976 | 0.3113 | 0.4741 | 0.0 | 0.1912 | 0.1018 | 0.3684 | 0.2406 | 0.4602 |
| 0.0372 | 120.0 | 194160 | 0.4549 | 0.2262 | 0.2986 | 0.3355 | nan | 0.2106 | 0.1016 | 0.3963 | 0.3109 | 0.4737 | 0.0 | 0.1902 | 0.1002 | 0.3671 | 0.2399 | 0.4598 |
| 0.0462 | 121.0 | 195778 | 0.4614 | 0.2267 | 0.2988 | 0.3356 | nan | 0.2076 | 0.1050 | 0.3973 | 0.3085 | 0.4756 | 0.0 | 0.1876 | 0.1035 | 0.3679 | 0.2403 | 0.4609 |
| 0.0586 | 122.0 | 197396 | 0.4545 | 0.2245 | 0.2967 | 0.3348 | nan | 0.2093 | 0.0968 | 0.3961 | 0.3120 | 0.4691 | 0.0 | 0.1890 | 0.0956 | 0.3668 | 0.2399 | 0.4559 |
| 0.0483 | 123.0 | 199014 | 0.4603 | 0.2283 | 0.3013 | 0.3356 | nan | 0.2133 | 0.1062 | 0.3952 | 0.3099 | 0.4818 | 0.0 | 0.1921 | 0.1047 | 0.3663 | 0.2398 | 0.4668 |
| 0.0613 | 123.6094 | 200000 | 0.4592 | 0.2260 | 0.2982 | 0.3342 | nan | 0.2100 | 0.1023 | 0.3946 | 0.3102 | 0.4740 | 0.0 | 0.1894 | 0.1009 | 0.3656 | 0.2399 | 0.4600 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.3.0
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"unlabeled",
"découpe",
"reflet météo",
"autre réparation",
"glaçage ou ressuage",
"emergence"
] |
Dnq2025/mask2former-finetuned-ER-Mito-LD6 |
# mask2former-finetuned-ER-Mito-LD6
This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
It achieves the following results on the evaluation set:
- Loss: 36.6791
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 6450
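The polynomial decay named above can be sketched the same way with `get_scheduler`; zero warmup is again an assumption, and with the library's default power of 1.0 the decay is effectively linear from 4e-4 down to the default end LR.
```python
import torch
from transformers import get_scheduler

model = torch.nn.Linear(8, 8)  # stand-in parameters
optimizer = torch.optim.AdamW(model.parameters(), lr=4e-4, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_scheduler(
    "polynomial", optimizer=optimizer, num_warmup_steps=0, num_training_steps=6450
)
```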
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 54.5306 | 1.0 | 129 | 47.6483 |
| 42.7396 | 2.0 | 258 | 34.9192 |
| 38.1013 | 3.0 | 387 | 31.6591 |
| 32.8585 | 4.0 | 516 | 33.2288 |
| 30.481 | 5.0 | 645 | 31.1107 |
| 29.7583 | 6.0 | 774 | 33.4640 |
| 26.7225 | 7.0 | 903 | 28.5952 |
| 26.0046 | 8.0 | 1032 | 29.7853 |
| 24.8025 | 9.0 | 1161 | 28.1448 |
| 24.0444 | 10.0 | 1290 | 29.0327 |
| 23.1171 | 11.0 | 1419 | 27.7611 |
| 22.0008 | 12.0 | 1548 | 27.4982 |
| 22.4006 | 13.0 | 1677 | 28.5804 |
| 20.6462 | 14.0 | 1806 | 26.5733 |
| 20.1936 | 15.0 | 1935 | 27.0318 |
| 19.4983 | 16.0 | 2064 | 26.2896 |
| 19.4915 | 17.0 | 2193 | 27.2829 |
| 18.8834 | 18.0 | 2322 | 27.4219 |
| 17.7948 | 19.0 | 2451 | 26.9262 |
| 17.7806 | 20.0 | 2580 | 28.1830 |
| 17.3828 | 21.0 | 2709 | 26.5501 |
| 16.7812 | 22.0 | 2838 | 27.7953 |
| 16.5225 | 23.0 | 2967 | 26.7844 |
| 16.6727 | 24.0 | 3096 | 29.3165 |
| 15.9375 | 25.0 | 3225 | 29.3433 |
| 15.554 | 26.0 | 3354 | 27.7353 |
| 15.3216 | 27.0 | 3483 | 28.5868 |
| 15.2336 | 28.0 | 3612 | 30.1337 |
| 14.4484 | 29.0 | 3741 | 29.4535 |
| 14.5668 | 30.0 | 3870 | 29.9552 |
| 14.2886 | 31.0 | 3999 | 30.3295 |
| 13.9594 | 32.0 | 4128 | 30.9996 |
| 13.3464 | 33.0 | 4257 | 29.5446 |
| 14.3524 | 34.0 | 4386 | 30.5839 |
| 13.7015 | 35.0 | 4515 | 31.6050 |
| 13.1693 | 36.0 | 4644 | 30.4525 |
| 13.0106 | 37.0 | 4773 | 30.8857 |
| 13.4503 | 38.0 | 4902 | 33.0173 |
| 12.885 | 39.0 | 5031 | 33.0191 |
| 12.4798 | 40.0 | 5160 | 32.4086 |
| 12.569 | 41.0 | 5289 | 35.1227 |
| 12.2572 | 42.0 | 5418 | 33.3447 |
| 12.1342 | 43.0 | 5547 | 34.8180 |
| 12.6542 | 44.0 | 5676 | 34.2102 |
| 11.9929 | 45.0 | 5805 | 35.5142 |
| 11.2777 | 46.0 | 5934 | 36.4062 |
| 12.3835 | 47.0 | 6063 | 36.1198 |
| 11.4719 | 48.0 | 6192 | 36.6292 |
| 12.1422 | 49.0 | 6321 | 36.8263 |
| 11.636 | 50.0 | 6450 | 36.5817 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"er",
"mito",
"ld"
] |
Dnq2025/mask2former-finetuned-ER-Mito-LD7 |
# mask2former-finetuned-ER-Mito-LD7
This model is a fine-tuned version of [facebook/mask2former-swin-large-ade-semantic](https://huggingface.co/facebook/mask2former-swin-large-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
It achieves the following results on the evaluation set:
- Loss: 32.5161
## Model description
More information needed
## Intended uses & limitations
More information needed
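Until usage notes are added: after producing a per-pixel prediction (e.g. with `Mask2FormerForUniversalSegmentation` and the processor's `post_process_semantic_segmentation`, as for the sibling LD5/LD6 checkpoints), a quick way to inspect it is to colorize the label map. In the sketch below a random map stands in for a real prediction and the palette is arbitrary.
```python
import numpy as np
import matplotlib.pyplot as plt

semantic_map = np.random.randint(0, 4, size=(128, 128))  # stand-in prediction

# Arbitrary colors for background / er / mito / ld.
palette = np.array(
    [[0, 0, 0], [255, 64, 64], [64, 255, 64], [64, 64, 255]], dtype=np.uint8
)
overlay = palette[semantic_map]  # (H, W, 3) RGB image

plt.imshow(overlay)
plt.axis("off")
plt.show()
```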
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 6450
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 49.96 | 1.0 | 129 | 38.5311 |
| 37.9944 | 2.0 | 258 | 33.0716 |
| 36.2103 | 3.0 | 387 | 29.7335 |
| 29.2814 | 4.0 | 516 | 31.5442 |
| 27.3516 | 5.0 | 645 | 28.0345 |
| 26.4039 | 6.0 | 774 | 27.9411 |
| 24.3475 | 7.0 | 903 | 26.4108 |
| 23.4804 | 8.0 | 1032 | 26.8475 |
| 22.4815 | 9.0 | 1161 | 25.9447 |
| 21.721 | 10.0 | 1290 | 27.1656 |
| 20.8443 | 11.0 | 1419 | 33.2659 |
| 19.9949 | 12.0 | 1548 | 26.9611 |
| 19.893 | 13.0 | 1677 | 26.0445 |
| 18.1322 | 14.0 | 1806 | 27.2854 |
| 18.1679 | 15.0 | 1935 | 25.4194 |
| 17.4814 | 16.0 | 2064 | 25.4006 |
| 17.402 | 17.0 | 2193 | 24.8677 |
| 17.0285 | 18.0 | 2322 | 25.9922 |
| 15.8946 | 19.0 | 2451 | 27.1687 |
| 15.8518 | 20.0 | 2580 | 29.3397 |
| 15.4202 | 21.0 | 2709 | 25.7427 |
| 14.9686 | 22.0 | 2838 | 28.8585 |
| 14.7436 | 23.0 | 2967 | 27.9649 |
| 15.1461 | 24.0 | 3096 | 27.5371 |
| 14.3666 | 25.0 | 3225 | 27.2910 |
| 13.9871 | 26.0 | 3354 | 28.4562 |
| 13.8003 | 27.0 | 3483 | 27.0616 |
| 13.7903 | 28.0 | 3612 | 33.0673 |
| 13.2151 | 29.0 | 3741 | 28.1574 |
| 13.2489 | 30.0 | 3870 | 27.9714 |
| 12.9787 | 31.0 | 3999 | 29.6233 |
| 12.8853 | 32.0 | 4128 | 32.2755 |
| 12.5442 | 33.0 | 4257 | 30.2798 |
| 13.3521 | 34.0 | 4386 | 28.7282 |
| 12.609 | 35.0 | 4515 | 27.4472 |
| 12.1436 | 36.0 | 4644 | 28.6240 |
| 12.0534 | 37.0 | 4773 | 28.3248 |
| 12.4731 | 38.0 | 4902 | 31.0330 |
| 12.0568 | 39.0 | 5031 | 33.3478 |
| 11.7165 | 40.0 | 5160 | 32.6755 |
| 11.7194 | 41.0 | 5289 | 32.9583 |
| 11.5118 | 42.0 | 5418 | 31.8171 |
| 11.2862 | 43.0 | 5547 | 30.4766 |
| 11.8368 | 44.0 | 5676 | 30.9541 |
| 11.2132 | 45.0 | 5805 | 32.4065 |
| 10.606 | 46.0 | 5934 | 31.5392 |
| 11.7442 | 47.0 | 6063 | 31.8038 |
| 10.7855 | 48.0 | 6192 | 32.5302 |
| 11.3661 | 49.0 | 6321 | 32.8178 |
| 10.9675 | 50.0 | 6450 | 32.5147 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"er",
"mito",
"ld"
] |
TommyClas/phaseseg_models |
# phaseseg_models
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the TommyClas/phase_seg dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0372
- Mean Iou: 0.9744
- Mean Accuracy: 0.9872
- Overall Accuracy: 0.9869
- Accuracy 背景 (background): nan
- Accuracy 未水化水泥颗粒 (unhydrated cement particles): 0.9806
- Accuracy 孔隙 (pores): 0.9893
- Accuracy 氢氧化钙 (calcium hydroxide): 0.9901
- Accuracy 其他水化物 (other hydration products): 0.9887
- Iou 背景 (background): nan
- Iou 未水化水泥颗粒 (unhydrated cement particles): 0.9730
- Iou 孔隙 (pores): 0.9695
- Iou 氢氧化钙 (calcium hydroxide): 0.9767
- Iou 其他水化物 (other hydration products): 0.9782
## Model description
More information needed
## Intended uses & limitations
More information needed
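The class names for this card are Chinese; a sketch for reading them off the checkpoint is shown below, with rough English glosses added by hand (the glosses are not part of the model config).
```python
from transformers import SegformerForSemanticSegmentation

model = SegformerForSemanticSegmentation.from_pretrained("TommyClas/phaseseg_models")
print(model.config.id2label)

# Rough English glosses for the class names (not stored in the config):
glosses = {
    "背景": "background",
    "未水化水泥颗粒": "unhydrated cement particles",
    "孔隙": "pores",
    "氢氧化钙": "calcium hydroxide (portlandite)",
    "其他水化物": "other hydration products",
}
```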
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 10000
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 背景 | Accuracy 未水化水泥颗粒 | Accuracy 孔隙 | Accuracy 氢氧化钙 | Accuracy 其他水化物 | Iou 背景 | Iou 未水化水泥颗粒 | Iou 孔隙 | Iou 氢氧化钙 | Iou 其他水化物 |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-----------:|:----------------:|:-----------:|:-------------:|:--------------:|:------:|:-----------:|:------:|:--------:|:---------:|
| No log | 1.0 | 50 | 0.2994 | 0.7375 | 0.9586 | 0.9580 | nan | 0.9557 | 0.9290 | 0.9696 | 0.9802 | 0.0 | 0.9224 | 0.9066 | 0.9195 | 0.9392 |
| 0.4501 | 2.0 | 100 | 0.1558 | 0.7645 | 0.9767 | 0.9766 | nan | 0.9802 | 0.9580 | 0.9758 | 0.9929 | 0.0 | 0.9585 | 0.9504 | 0.9524 | 0.9609 |
| 0.4501 | 3.0 | 150 | 0.1193 | 0.7715 | 0.9814 | 0.9812 | nan | 0.9797 | 0.9718 | 0.9829 | 0.9912 | 0.0 | 0.9661 | 0.9628 | 0.9607 | 0.9680 |
| 0.0949 | 4.0 | 200 | 0.0898 | 0.7745 | 0.9835 | 0.9834 | nan | 0.9844 | 0.9751 | 0.9842 | 0.9902 | 0.0 | 0.9702 | 0.9667 | 0.9655 | 0.9699 |
| 0.0949 | 5.0 | 250 | 0.0766 | 0.7762 | 0.9848 | 0.9848 | nan | 0.9848 | 0.9799 | 0.9842 | 0.9905 | 0.0 | 0.9729 | 0.9696 | 0.9674 | 0.9713 |
| 0.0584 | 6.0 | 300 | 0.0624 | 0.7771 | 0.9856 | 0.9855 | nan | 0.9865 | 0.9802 | 0.9852 | 0.9905 | 0.0 | 0.9747 | 0.9704 | 0.9684 | 0.9723 |
| 0.0584 | 7.0 | 350 | 0.0628 | 0.7777 | 0.9859 | 0.9858 | nan | 0.9845 | 0.9817 | 0.9865 | 0.9907 | 0.0 | 0.9743 | 0.9717 | 0.9695 | 0.9731 |
| 0.0441 | 8.0 | 400 | 0.0575 | 0.7784 | 0.9863 | 0.9863 | nan | 0.9852 | 0.9841 | 0.9846 | 0.9914 | 0.0 | 0.9750 | 0.9732 | 0.9709 | 0.9732 |
| 0.0441 | 9.0 | 450 | 0.0500 | 0.7788 | 0.9867 | 0.9866 | nan | 0.9855 | 0.9847 | 0.9839 | 0.9925 | 0.0 | 0.9762 | 0.9738 | 0.9706 | 0.9734 |
| 0.0363 | 10.0 | 500 | 0.0496 | 0.7795 | 0.9870 | 0.9869 | nan | 0.9841 | 0.9859 | 0.9875 | 0.9905 | 0.0 | 0.9753 | 0.9745 | 0.9726 | 0.9751 |
| 0.0363 | 11.0 | 550 | 0.0458 | 0.7798 | 0.9873 | 0.9872 | nan | 0.9844 | 0.9863 | 0.9875 | 0.9910 | 0.0 | 0.9758 | 0.9749 | 0.9731 | 0.9755 |
| 0.0315 | 12.0 | 600 | 0.0423 | 0.7802 | 0.9875 | 0.9875 | nan | 0.9872 | 0.9845 | 0.9895 | 0.9891 | 0.0 | 0.9771 | 0.9750 | 0.9731 | 0.9757 |
| 0.0315 | 13.0 | 650 | 0.0437 | 0.7800 | 0.9874 | 0.9873 | nan | 0.9851 | 0.9848 | 0.9891 | 0.9908 | 0.0 | 0.9762 | 0.9749 | 0.9731 | 0.9760 |
| 0.0278 | 14.0 | 700 | 0.0390 | 0.7805 | 0.9878 | 0.9877 | nan | 0.9862 | 0.9859 | 0.9874 | 0.9916 | 0.0 | 0.9772 | 0.9753 | 0.9738 | 0.9762 |
| 0.0278 | 15.0 | 750 | 0.0404 | 0.7799 | 0.9874 | 0.9873 | nan | 0.9834 | 0.9872 | 0.9896 | 0.9896 | 0.0 | 0.9753 | 0.9738 | 0.9740 | 0.9764 |
| 0.0255 | 16.0 | 800 | 0.0422 | 0.7789 | 0.9868 | 0.9866 | nan | 0.9809 | 0.9872 | 0.9878 | 0.9911 | 0.0 | 0.9725 | 0.9709 | 0.9745 | 0.9765 |
| 0.0255 | 17.0 | 850 | 0.0387 | 0.7794 | 0.9871 | 0.9869 | nan | 0.9831 | 0.9858 | 0.9900 | 0.9895 | 0.0 | 0.9742 | 0.9720 | 0.9739 | 0.9767 |
| 0.0235 | 18.0 | 900 | 0.0395 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9810 | 0.9882 | 0.9881 | 0.9903 | 0.0 | 0.9725 | 0.9706 | 0.9751 | 0.9770 |
| 0.0235 | 19.0 | 950 | 0.0364 | 0.7790 | 0.9868 | 0.9866 | nan | 0.9809 | 0.9886 | 0.9867 | 0.9911 | 0.0 | 0.9723 | 0.9706 | 0.9752 | 0.9769 |
| 0.0221 | 20.0 | 1000 | 0.0394 | 0.7785 | 0.9865 | 0.9863 | nan | 0.9801 | 0.9870 | 0.9887 | 0.9904 | 0.0 | 0.9713 | 0.9691 | 0.9751 | 0.9769 |
| 0.0221 | 21.0 | 1050 | 0.0374 | 0.7787 | 0.9866 | 0.9864 | nan | 0.9812 | 0.9873 | 0.9871 | 0.9910 | 0.0 | 0.9720 | 0.9697 | 0.9750 | 0.9768 |
| 0.021 | 22.0 | 1100 | 0.0364 | 0.7787 | 0.9867 | 0.9865 | nan | 0.9804 | 0.9874 | 0.9884 | 0.9906 | 0.0 | 0.9718 | 0.9695 | 0.9753 | 0.9771 |
| 0.021 | 23.0 | 1150 | 0.0375 | 0.7784 | 0.9865 | 0.9863 | nan | 0.9792 | 0.9883 | 0.9888 | 0.9897 | 0.0 | 0.9708 | 0.9687 | 0.9754 | 0.9774 |
| 0.0199 | 24.0 | 1200 | 0.0371 | 0.7782 | 0.9864 | 0.9861 | nan | 0.9792 | 0.9871 | 0.9878 | 0.9913 | 0.0 | 0.9709 | 0.9684 | 0.9749 | 0.9768 |
| 0.0199 | 25.0 | 1250 | 0.0393 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9788 | 0.9885 | 0.9890 | 0.9897 | 0.0 | 0.9707 | 0.9683 | 0.9754 | 0.9776 |
| 0.0191 | 26.0 | 1300 | 0.0387 | 0.7783 | 0.9865 | 0.9862 | nan | 0.9791 | 0.9878 | 0.9904 | 0.9887 | 0.0 | 0.9709 | 0.9683 | 0.9750 | 0.9775 |
| 0.0191 | 27.0 | 1350 | 0.0384 | 0.7785 | 0.9865 | 0.9863 | nan | 0.9794 | 0.9880 | 0.9897 | 0.9890 | 0.0 | 0.9711 | 0.9685 | 0.9754 | 0.9775 |
| 0.0188 | 28.0 | 1400 | 0.0383 | 0.7783 | 0.9865 | 0.9862 | nan | 0.9779 | 0.9893 | 0.9884 | 0.9903 | 0.0 | 0.9705 | 0.9682 | 0.9754 | 0.9776 |
| 0.0188 | 29.0 | 1450 | 0.0377 | 0.7784 | 0.9864 | 0.9862 | nan | 0.9785 | 0.9902 | 0.9890 | 0.9880 | 0.0 | 0.9703 | 0.9680 | 0.9759 | 0.9775 |
| 0.018 | 30.0 | 1500 | 0.0378 | 0.9732 | 0.9866 | 0.9863 | nan | 0.9794 | 0.9885 | 0.9888 | 0.9895 | nan | 0.9710 | 0.9683 | 0.9757 | 0.9777 |
| 0.018 | 31.0 | 1550 | 0.0379 | 0.9730 | 0.9865 | 0.9862 | nan | 0.9794 | 0.9875 | 0.9901 | 0.9890 | nan | 0.9710 | 0.9681 | 0.9753 | 0.9776 |
| 0.0175 | 32.0 | 1600 | 0.0381 | 0.9730 | 0.9865 | 0.9862 | nan | 0.9792 | 0.9884 | 0.9894 | 0.9889 | nan | 0.9708 | 0.9682 | 0.9755 | 0.9775 |
| 0.0175 | 33.0 | 1650 | 0.0394 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9783 | 0.9896 | 0.9894 | 0.9886 | 0.0 | 0.9705 | 0.9679 | 0.9758 | 0.9777 |
| 0.0171 | 34.0 | 1700 | 0.0390 | 0.7784 | 0.9865 | 0.9863 | nan | 0.9800 | 0.9871 | 0.9902 | 0.9887 | 0.0 | 0.9712 | 0.9682 | 0.9753 | 0.9775 |
| 0.0171 | 35.0 | 1750 | 0.0385 | 0.9729 | 0.9865 | 0.9862 | nan | 0.9790 | 0.9878 | 0.9892 | 0.9899 | nan | 0.9710 | 0.9680 | 0.9754 | 0.9774 |
| 0.0166 | 36.0 | 1800 | 0.0384 | 0.9731 | 0.9865 | 0.9863 | nan | 0.9791 | 0.9884 | 0.9889 | 0.9897 | nan | 0.9711 | 0.9682 | 0.9756 | 0.9775 |
| 0.0166 | 37.0 | 1850 | 0.0389 | 0.9730 | 0.9865 | 0.9862 | nan | 0.9794 | 0.9875 | 0.9891 | 0.9898 | nan | 0.9711 | 0.9680 | 0.9754 | 0.9775 |
| 0.0162 | 38.0 | 1900 | 0.0375 | 0.9731 | 0.9865 | 0.9863 | nan | 0.9797 | 0.9879 | 0.9901 | 0.9884 | nan | 0.9711 | 0.9681 | 0.9755 | 0.9777 |
| 0.0162 | 39.0 | 1950 | 0.0389 | 0.9731 | 0.9866 | 0.9863 | nan | 0.9786 | 0.9891 | 0.9891 | 0.9894 | nan | 0.9709 | 0.9681 | 0.9759 | 0.9776 |
| 0.0158 | 40.0 | 2000 | 0.0396 | 0.9730 | 0.9865 | 0.9862 | nan | 0.9783 | 0.9897 | 0.9894 | 0.9886 | nan | 0.9705 | 0.9678 | 0.9761 | 0.9777 |
| 0.0158 | 41.0 | 2050 | 0.0397 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9788 | 0.9889 | 0.9887 | 0.9895 | 0.0 | 0.9708 | 0.9679 | 0.9759 | 0.9773 |
| 0.0156 | 42.0 | 2100 | 0.0401 | 0.9730 | 0.9865 | 0.9862 | nan | 0.9782 | 0.9889 | 0.9890 | 0.9898 | nan | 0.9707 | 0.9678 | 0.9758 | 0.9775 |
| 0.0156 | 43.0 | 2150 | 0.0399 | 0.9730 | 0.9865 | 0.9862 | nan | 0.9789 | 0.9886 | 0.9896 | 0.9889 | nan | 0.9708 | 0.9678 | 0.9757 | 0.9777 |
| 0.0154 | 44.0 | 2200 | 0.0407 | 0.9728 | 0.9864 | 0.9861 | nan | 0.9781 | 0.9900 | 0.9884 | 0.9891 | nan | 0.9702 | 0.9673 | 0.9762 | 0.9776 |
| 0.0154 | 45.0 | 2250 | 0.0405 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9785 | 0.9901 | 0.9896 | 0.9877 | 0.0 | 0.9706 | 0.9675 | 0.9761 | 0.9776 |
| 0.0151 | 46.0 | 2300 | 0.0411 | 0.7782 | 0.9864 | 0.9861 | nan | 0.9784 | 0.9903 | 0.9901 | 0.9866 | 0.0 | 0.9704 | 0.9673 | 0.9758 | 0.9775 |
| 0.0151 | 47.0 | 2350 | 0.0394 | 0.9732 | 0.9866 | 0.9863 | nan | 0.9790 | 0.9896 | 0.9890 | 0.9886 | nan | 0.9709 | 0.9681 | 0.9759 | 0.9777 |
| 0.015 | 48.0 | 2400 | 0.0405 | 0.7784 | 0.9865 | 0.9863 | nan | 0.9787 | 0.9885 | 0.9892 | 0.9898 | 0.0 | 0.9708 | 0.9677 | 0.9757 | 0.9780 |
| 0.015 | 49.0 | 2450 | 0.0399 | 0.7786 | 0.9866 | 0.9863 | nan | 0.9787 | 0.9905 | 0.9882 | 0.9888 | 0.0 | 0.9707 | 0.9678 | 0.9764 | 0.9779 |
| 0.0149 | 50.0 | 2500 | 0.0410 | 0.7783 | 0.9864 | 0.9861 | nan | 0.9781 | 0.9895 | 0.9889 | 0.9891 | 0.0 | 0.9705 | 0.9673 | 0.9761 | 0.9776 |
| 0.0149 | 51.0 | 2550 | 0.0405 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9793 | 0.9898 | 0.9895 | 0.9872 | 0.0 | 0.9707 | 0.9676 | 0.9763 | 0.9776 |
| 0.0145 | 52.0 | 2600 | 0.0402 | 0.7785 | 0.9866 | 0.9863 | nan | 0.9788 | 0.9895 | 0.9893 | 0.9887 | 0.0 | 0.9710 | 0.9678 | 0.9760 | 0.9778 |
| 0.0145 | 53.0 | 2650 | 0.0401 | 0.7786 | 0.9866 | 0.9864 | nan | 0.9791 | 0.9889 | 0.9898 | 0.9887 | 0.0 | 0.9710 | 0.9680 | 0.9761 | 0.9780 |
| 0.0144 | 54.0 | 2700 | 0.0392 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9795 | 0.9888 | 0.9887 | 0.9896 | 0.0 | 0.9714 | 0.9682 | 0.9761 | 0.9777 |
| 0.0144 | 55.0 | 2750 | 0.0409 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9787 | 0.9886 | 0.9895 | 0.9891 | 0.0 | 0.9706 | 0.9675 | 0.9760 | 0.9777 |
| 0.0141 | 56.0 | 2800 | 0.0410 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9779 | 0.9897 | 0.9897 | 0.9887 | 0.0 | 0.9707 | 0.9675 | 0.9759 | 0.9778 |
| 0.0141 | 57.0 | 2850 | 0.0412 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9780 | 0.9898 | 0.9891 | 0.9891 | 0.0 | 0.9707 | 0.9676 | 0.9761 | 0.9776 |
| 0.014 | 58.0 | 2900 | 0.0403 | 0.9732 | 0.9866 | 0.9863 | nan | 0.9794 | 0.9889 | 0.9889 | 0.9891 | nan | 0.9713 | 0.9680 | 0.9761 | 0.9775 |
| 0.014 | 59.0 | 2950 | 0.0404 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9787 | 0.9899 | 0.9889 | 0.9892 | 0.0 | 0.9711 | 0.9680 | 0.9763 | 0.9779 |
| 0.0139 | 60.0 | 3000 | 0.0412 | 0.7783 | 0.9865 | 0.9862 | nan | 0.9786 | 0.9893 | 0.9900 | 0.9879 | 0.0 | 0.9708 | 0.9675 | 0.9758 | 0.9775 |
| 0.0139 | 61.0 | 3050 | 0.0410 | 0.7785 | 0.9866 | 0.9863 | nan | 0.9789 | 0.9893 | 0.9901 | 0.9879 | 0.0 | 0.9708 | 0.9676 | 0.9762 | 0.9780 |
| 0.0138 | 62.0 | 3100 | 0.0413 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9778 | 0.9896 | 0.9893 | 0.9894 | 0.0 | 0.9705 | 0.9675 | 0.9763 | 0.9779 |
| 0.0138 | 63.0 | 3150 | 0.0400 | 0.7786 | 0.9866 | 0.9863 | nan | 0.9794 | 0.9887 | 0.9908 | 0.9874 | 0.0 | 0.9715 | 0.9681 | 0.9757 | 0.9776 |
| 0.0138 | 64.0 | 3200 | 0.0401 | 0.7786 | 0.9866 | 0.9864 | nan | 0.9800 | 0.9888 | 0.9904 | 0.9873 | 0.0 | 0.9715 | 0.9682 | 0.9758 | 0.9776 |
| 0.0138 | 65.0 | 3250 | 0.0414 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9788 | 0.9888 | 0.9905 | 0.9879 | 0.0 | 0.9708 | 0.9675 | 0.9759 | 0.9776 |
| 0.0136 | 66.0 | 3300 | 0.0397 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9796 | 0.9895 | 0.9897 | 0.9880 | 0.0 | 0.9714 | 0.9683 | 0.9763 | 0.9776 |
| 0.0136 | 67.0 | 3350 | 0.0417 | 0.7783 | 0.9864 | 0.9861 | nan | 0.9777 | 0.9894 | 0.9903 | 0.9884 | 0.0 | 0.9702 | 0.9671 | 0.9761 | 0.9779 |
| 0.0135 | 68.0 | 3400 | 0.0409 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9790 | 0.9898 | 0.9908 | 0.9862 | 0.0 | 0.9711 | 0.9678 | 0.9758 | 0.9773 |
| 0.0135 | 69.0 | 3450 | 0.0399 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9796 | 0.9896 | 0.9887 | 0.9888 | 0.0 | 0.9714 | 0.9681 | 0.9764 | 0.9778 |
| 0.0133 | 70.0 | 3500 | 0.0407 | 0.7785 | 0.9865 | 0.9863 | nan | 0.9792 | 0.9903 | 0.9901 | 0.9866 | 0.0 | 0.9713 | 0.9676 | 0.9761 | 0.9775 |
| 0.0133 | 71.0 | 3550 | 0.0407 | 0.7786 | 0.9866 | 0.9864 | nan | 0.9787 | 0.9896 | 0.9892 | 0.9890 | 0.0 | 0.9712 | 0.9679 | 0.9761 | 0.9778 |
| 0.0131 | 72.0 | 3600 | 0.0394 | 0.7789 | 0.9868 | 0.9865 | nan | 0.9790 | 0.9899 | 0.9895 | 0.9887 | 0.0 | 0.9714 | 0.9681 | 0.9766 | 0.9781 |
| 0.0131 | 73.0 | 3650 | 0.0410 | 0.7785 | 0.9865 | 0.9863 | nan | 0.9796 | 0.9897 | 0.9903 | 0.9865 | 0.0 | 0.9713 | 0.9678 | 0.9759 | 0.9774 |
| 0.0132 | 74.0 | 3700 | 0.0412 | 0.7785 | 0.9866 | 0.9863 | nan | 0.9791 | 0.9900 | 0.9901 | 0.9871 | 0.0 | 0.9713 | 0.9678 | 0.9761 | 0.9774 |
| 0.0132 | 75.0 | 3750 | 0.0412 | 0.7786 | 0.9866 | 0.9863 | nan | 0.9785 | 0.9902 | 0.9898 | 0.9879 | 0.0 | 0.9711 | 0.9676 | 0.9763 | 0.9779 |
| 0.0131 | 76.0 | 3800 | 0.0396 | 0.7786 | 0.9866 | 0.9864 | nan | 0.9798 | 0.9893 | 0.9904 | 0.9870 | 0.0 | 0.9716 | 0.9682 | 0.9760 | 0.9775 |
| 0.0131 | 77.0 | 3850 | 0.0418 | 0.7784 | 0.9865 | 0.9862 | nan | 0.9789 | 0.9896 | 0.9905 | 0.9871 | 0.0 | 0.9711 | 0.9676 | 0.9760 | 0.9775 |
| 0.013 | 78.0 | 3900 | 0.0396 | 0.7786 | 0.9866 | 0.9864 | nan | 0.9787 | 0.9899 | 0.9906 | 0.9872 | 0.0 | 0.9713 | 0.9678 | 0.9760 | 0.9779 |
| 0.013 | 79.0 | 3950 | 0.0398 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9794 | 0.9898 | 0.9905 | 0.9869 | 0.0 | 0.9715 | 0.9680 | 0.9762 | 0.9777 |
| 0.0128 | 80.0 | 4000 | 0.0402 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9789 | 0.9898 | 0.9896 | 0.9885 | 0.0 | 0.9714 | 0.9680 | 0.9765 | 0.9779 |
| 0.0128 | 81.0 | 4050 | 0.0404 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9787 | 0.9903 | 0.9902 | 0.9874 | 0.0 | 0.9713 | 0.9677 | 0.9763 | 0.9779 |
| 0.0127 | 82.0 | 4100 | 0.0397 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9794 | 0.9896 | 0.9901 | 0.9877 | 0.0 | 0.9716 | 0.9681 | 0.9762 | 0.9778 |
| 0.0127 | 83.0 | 4150 | 0.0411 | 0.7786 | 0.9866 | 0.9863 | nan | 0.9786 | 0.9898 | 0.9899 | 0.9881 | 0.0 | 0.9712 | 0.9677 | 0.9763 | 0.9778 |
| 0.0127 | 84.0 | 4200 | 0.0406 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9787 | 0.9903 | 0.9890 | 0.9889 | 0.0 | 0.9713 | 0.9680 | 0.9766 | 0.9781 |
| 0.0127 | 85.0 | 4250 | 0.0413 | 0.7786 | 0.9866 | 0.9864 | nan | 0.9787 | 0.9900 | 0.9888 | 0.9891 | 0.0 | 0.9711 | 0.9677 | 0.9764 | 0.9779 |
| 0.0126 | 86.0 | 4300 | 0.0400 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9792 | 0.9904 | 0.9895 | 0.9878 | 0.0 | 0.9715 | 0.9681 | 0.9765 | 0.9778 |
| 0.0126 | 87.0 | 4350 | 0.0397 | 0.7788 | 0.9868 | 0.9865 | nan | 0.9789 | 0.9898 | 0.9898 | 0.9885 | 0.0 | 0.9715 | 0.9682 | 0.9765 | 0.9780 |
| 0.0125 | 88.0 | 4400 | 0.0398 | 0.7788 | 0.9868 | 0.9865 | nan | 0.9791 | 0.9903 | 0.9894 | 0.9883 | 0.0 | 0.9716 | 0.9681 | 0.9767 | 0.9779 |
| 0.0125 | 89.0 | 4450 | 0.0400 | 0.7787 | 0.9867 | 0.9865 | nan | 0.9795 | 0.9898 | 0.9902 | 0.9872 | 0.0 | 0.9716 | 0.9682 | 0.9763 | 0.9776 |
| 0.0125 | 90.0 | 4500 | 0.0397 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9788 | 0.9902 | 0.9893 | 0.9887 | 0.0 | 0.9716 | 0.9680 | 0.9765 | 0.9779 |
| 0.0125 | 91.0 | 4550 | 0.0400 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9790 | 0.9901 | 0.9903 | 0.9875 | 0.0 | 0.9715 | 0.9680 | 0.9762 | 0.9779 |
| 0.0125 | 92.0 | 4600 | 0.0392 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9790 | 0.9903 | 0.9898 | 0.9878 | 0.0 | 0.9716 | 0.9680 | 0.9765 | 0.9777 |
| 0.0125 | 93.0 | 4650 | 0.0403 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9791 | 0.9900 | 0.9905 | 0.9873 | 0.0 | 0.9716 | 0.9681 | 0.9763 | 0.9777 |
| 0.0123 | 94.0 | 4700 | 0.0396 | 0.7789 | 0.9868 | 0.9865 | nan | 0.9797 | 0.9898 | 0.9903 | 0.9874 | 0.0 | 0.9718 | 0.9684 | 0.9764 | 0.9778 |
| 0.0123 | 95.0 | 4750 | 0.0405 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9790 | 0.9901 | 0.9903 | 0.9874 | 0.0 | 0.9715 | 0.9679 | 0.9764 | 0.9778 |
| 0.0122 | 96.0 | 4800 | 0.0394 | 0.7789 | 0.9868 | 0.9865 | nan | 0.9793 | 0.9896 | 0.9898 | 0.9884 | 0.0 | 0.9717 | 0.9682 | 0.9764 | 0.9780 |
| 0.0122 | 97.0 | 4850 | 0.0396 | 0.7789 | 0.9868 | 0.9865 | nan | 0.9790 | 0.9900 | 0.9900 | 0.9882 | 0.0 | 0.9716 | 0.9681 | 0.9766 | 0.9780 |
| 0.0122 | 98.0 | 4900 | 0.0399 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9797 | 0.9900 | 0.9904 | 0.9870 | 0.0 | 0.9718 | 0.9682 | 0.9764 | 0.9776 |
| 0.0122 | 99.0 | 4950 | 0.0394 | 0.7789 | 0.9868 | 0.9865 | nan | 0.9793 | 0.9896 | 0.9897 | 0.9885 | 0.0 | 0.9717 | 0.9682 | 0.9766 | 0.9780 |
| 0.0122 | 100.0 | 5000 | 0.0383 | 0.7790 | 0.9868 | 0.9866 | nan | 0.9804 | 0.9899 | 0.9895 | 0.9876 | 0.0 | 0.9720 | 0.9686 | 0.9767 | 0.9777 |
| 0.0122 | 101.0 | 5050 | 0.0399 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9794 | 0.9904 | 0.9895 | 0.9877 | 0.0 | 0.9716 | 0.9680 | 0.9766 | 0.9779 |
| 0.0121 | 102.0 | 5100 | 0.0392 | 0.7790 | 0.9868 | 0.9866 | nan | 0.9796 | 0.9898 | 0.9889 | 0.9890 | 0.0 | 0.9718 | 0.9685 | 0.9767 | 0.9779 |
| 0.0121 | 103.0 | 5150 | 0.0393 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9788 | 0.9901 | 0.9900 | 0.9881 | 0.0 | 0.9715 | 0.9679 | 0.9765 | 0.9781 |
| 0.012 | 104.0 | 5200 | 0.0400 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9790 | 0.9894 | 0.9904 | 0.9881 | 0.0 | 0.9716 | 0.9682 | 0.9763 | 0.9779 |
| 0.012 | 105.0 | 5250 | 0.0393 | 0.7789 | 0.9868 | 0.9865 | nan | 0.9796 | 0.9894 | 0.9904 | 0.9878 | 0.0 | 0.9718 | 0.9683 | 0.9764 | 0.9780 |
| 0.012 | 106.0 | 5300 | 0.0390 | 0.7789 | 0.9868 | 0.9866 | nan | 0.9794 | 0.9900 | 0.9890 | 0.9888 | 0.0 | 0.9719 | 0.9683 | 0.9766 | 0.9780 |
| 0.012 | 107.0 | 5350 | 0.0383 | 0.7790 | 0.9868 | 0.9866 | nan | 0.9801 | 0.9899 | 0.9903 | 0.9870 | 0.0 | 0.9721 | 0.9684 | 0.9765 | 0.9779 |
| 0.0119 | 108.0 | 5400 | 0.0380 | 0.7792 | 0.9870 | 0.9868 | nan | 0.9807 | 0.9892 | 0.9897 | 0.9883 | 0.0 | 0.9724 | 0.9690 | 0.9768 | 0.9780 |
| 0.0119 | 109.0 | 5450 | 0.0400 | 0.7787 | 0.9867 | 0.9864 | nan | 0.9786 | 0.9902 | 0.9902 | 0.9876 | 0.0 | 0.9714 | 0.9677 | 0.9764 | 0.9778 |
| 0.0119 | 110.0 | 5500 | 0.0385 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9801 | 0.9894 | 0.9891 | 0.9889 | 0.0 | 0.9721 | 0.9686 | 0.9768 | 0.9780 |
| 0.0119 | 111.0 | 5550 | 0.0385 | 0.7790 | 0.9869 | 0.9866 | nan | 0.9798 | 0.9896 | 0.9902 | 0.9879 | 0.0 | 0.9719 | 0.9685 | 0.9767 | 0.9781 |
| 0.0118 | 112.0 | 5600 | 0.0377 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9798 | 0.9891 | 0.9897 | 0.9891 | 0.0 | 0.9722 | 0.9687 | 0.9766 | 0.9782 |
| 0.0118 | 113.0 | 5650 | 0.0388 | 0.7790 | 0.9869 | 0.9866 | nan | 0.9794 | 0.9899 | 0.9904 | 0.9878 | 0.0 | 0.9719 | 0.9683 | 0.9767 | 0.9781 |
| 0.0118 | 114.0 | 5700 | 0.0391 | 0.7789 | 0.9868 | 0.9866 | nan | 0.9797 | 0.9891 | 0.9906 | 0.9880 | 0.0 | 0.9719 | 0.9683 | 0.9763 | 0.9781 |
| 0.0118 | 115.0 | 5750 | 0.0390 | 0.7789 | 0.9868 | 0.9866 | nan | 0.9796 | 0.9902 | 0.9899 | 0.9876 | 0.0 | 0.9719 | 0.9683 | 0.9766 | 0.9779 |
| 0.0118 | 116.0 | 5800 | 0.0390 | 0.7789 | 0.9868 | 0.9866 | nan | 0.9795 | 0.9899 | 0.9896 | 0.9882 | 0.0 | 0.9718 | 0.9682 | 0.9767 | 0.9779 |
| 0.0118 | 117.0 | 5850 | 0.0394 | 0.7788 | 0.9867 | 0.9865 | nan | 0.9791 | 0.9899 | 0.9896 | 0.9883 | 0.0 | 0.9717 | 0.9679 | 0.9765 | 0.9778 |
| 0.0117 | 118.0 | 5900 | 0.0386 | 0.7789 | 0.9868 | 0.9866 | nan | 0.9796 | 0.9898 | 0.9900 | 0.9879 | 0.0 | 0.9719 | 0.9682 | 0.9766 | 0.9779 |
| 0.0117 | 119.0 | 5950 | 0.0386 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9800 | 0.9895 | 0.9896 | 0.9885 | 0.0 | 0.9721 | 0.9686 | 0.9767 | 0.9781 |
| 0.0117 | 120.0 | 6000 | 0.0388 | 0.7790 | 0.9869 | 0.9866 | nan | 0.9796 | 0.9899 | 0.9902 | 0.9878 | 0.0 | 0.9719 | 0.9684 | 0.9767 | 0.9781 |
| 0.0117 | 121.0 | 6050 | 0.0389 | 0.7790 | 0.9868 | 0.9866 | nan | 0.9800 | 0.9896 | 0.9894 | 0.9883 | 0.0 | 0.9721 | 0.9684 | 0.9767 | 0.9778 |
| 0.0116 | 122.0 | 6100 | 0.0384 | 0.7790 | 0.9869 | 0.9866 | nan | 0.9796 | 0.9896 | 0.9897 | 0.9886 | 0.0 | 0.9720 | 0.9684 | 0.9767 | 0.9780 |
| 0.0116 | 123.0 | 6150 | 0.0386 | 0.7789 | 0.9868 | 0.9865 | nan | 0.9793 | 0.9899 | 0.9901 | 0.9879 | 0.0 | 0.9718 | 0.9680 | 0.9765 | 0.9781 |
| 0.0115 | 124.0 | 6200 | 0.0383 | 0.7792 | 0.9870 | 0.9867 | nan | 0.9802 | 0.9890 | 0.9900 | 0.9888 | 0.0 | 0.9722 | 0.9688 | 0.9767 | 0.9781 |
| 0.0115 | 125.0 | 6250 | 0.0381 | 0.7790 | 0.9869 | 0.9866 | nan | 0.9796 | 0.9892 | 0.9900 | 0.9888 | 0.0 | 0.9721 | 0.9685 | 0.9766 | 0.9780 |
| 0.0115 | 126.0 | 6300 | 0.0383 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9797 | 0.9894 | 0.9894 | 0.9893 | 0.0 | 0.9720 | 0.9686 | 0.9767 | 0.9782 |
| 0.0115 | 127.0 | 6350 | 0.0384 | 0.7790 | 0.9869 | 0.9866 | nan | 0.9797 | 0.9895 | 0.9901 | 0.9881 | 0.0 | 0.9719 | 0.9684 | 0.9766 | 0.9781 |
| 0.0115 | 128.0 | 6400 | 0.0377 | 0.7792 | 0.9870 | 0.9867 | nan | 0.9801 | 0.9891 | 0.9896 | 0.9891 | 0.0 | 0.9722 | 0.9688 | 0.9767 | 0.9781 |
| 0.0115 | 129.0 | 6450 | 0.0383 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9800 | 0.9898 | 0.9899 | 0.9880 | 0.0 | 0.9721 | 0.9685 | 0.9768 | 0.9782 |
| 0.0115 | 130.0 | 6500 | 0.0377 | 0.7791 | 0.9870 | 0.9867 | nan | 0.9797 | 0.9895 | 0.9901 | 0.9885 | 0.0 | 0.9723 | 0.9687 | 0.9767 | 0.9781 |
| 0.0115 | 131.0 | 6550 | 0.0380 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9800 | 0.9891 | 0.9897 | 0.9890 | 0.0 | 0.9722 | 0.9687 | 0.9767 | 0.9780 |
| 0.0114 | 132.0 | 6600 | 0.0377 | 0.7792 | 0.9870 | 0.9868 | nan | 0.9799 | 0.9893 | 0.9901 | 0.9887 | 0.0 | 0.9724 | 0.9689 | 0.9766 | 0.9782 |
| 0.0114 | 133.0 | 6650 | 0.0378 | 0.7792 | 0.9870 | 0.9867 | nan | 0.9801 | 0.9899 | 0.9897 | 0.9882 | 0.0 | 0.9722 | 0.9687 | 0.9769 | 0.9782 |
| 0.0114 | 134.0 | 6700 | 0.0379 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9801 | 0.9896 | 0.9902 | 0.9879 | 0.0 | 0.9723 | 0.9688 | 0.9767 | 0.9780 |
| 0.0114 | 135.0 | 6750 | 0.0374 | 0.7793 | 0.9870 | 0.9868 | nan | 0.9803 | 0.9894 | 0.9899 | 0.9884 | 0.0 | 0.9724 | 0.9690 | 0.9768 | 0.9782 |
| 0.0113 | 136.0 | 6800 | 0.0386 | 0.7790 | 0.9869 | 0.9866 | nan | 0.9796 | 0.9897 | 0.9903 | 0.9878 | 0.0 | 0.9720 | 0.9683 | 0.9766 | 0.9781 |
| 0.0113 | 137.0 | 6850 | 0.0378 | 0.9739 | 0.9870 | 0.9867 | nan | 0.9802 | 0.9895 | 0.9900 | 0.9880 | nan | 0.9724 | 0.9688 | 0.9766 | 0.9779 |
| 0.0114 | 138.0 | 6900 | 0.0378 | 0.9740 | 0.9870 | 0.9868 | nan | 0.9800 | 0.9895 | 0.9893 | 0.9893 | nan | 0.9722 | 0.9688 | 0.9769 | 0.9783 |
| 0.0114 | 139.0 | 6950 | 0.0380 | 0.7791 | 0.9869 | 0.9867 | nan | 0.9797 | 0.9896 | 0.9897 | 0.9888 | 0.0 | 0.9722 | 0.9685 | 0.9767 | 0.9782 |
| 0.0113 | 140.0 | 7000 | 0.0374 | 0.7793 | 0.9871 | 0.9868 | nan | 0.9803 | 0.9893 | 0.9899 | 0.9887 | 0.0 | 0.9725 | 0.9690 | 0.9768 | 0.9783 |
| 0.0113 | 141.0 | 7050 | 0.0378 | 0.7792 | 0.9870 | 0.9868 | nan | 0.9801 | 0.9894 | 0.9900 | 0.9886 | 0.0 | 0.9724 | 0.9689 | 0.9767 | 0.9781 |
| 0.0112 | 142.0 | 7100 | 0.0380 | 0.9740 | 0.9870 | 0.9868 | nan | 0.9801 | 0.9899 | 0.9897 | 0.9882 | nan | 0.9724 | 0.9687 | 0.9768 | 0.9782 |
| 0.0112 | 143.0 | 7150 | 0.0380 | 0.9740 | 0.9870 | 0.9868 | nan | 0.9800 | 0.9897 | 0.9899 | 0.9883 | nan | 0.9724 | 0.9688 | 0.9768 | 0.9781 |
| 0.0112 | 144.0 | 7200 | 0.0378 | 0.9741 | 0.9870 | 0.9868 | nan | 0.9802 | 0.9896 | 0.9897 | 0.9887 | nan | 0.9725 | 0.9690 | 0.9768 | 0.9781 |
| 0.0112 | 145.0 | 7250 | 0.0376 | 0.7793 | 0.9870 | 0.9868 | nan | 0.9806 | 0.9892 | 0.9903 | 0.9880 | 0.0 | 0.9726 | 0.9690 | 0.9767 | 0.9782 |
| 0.0112 | 146.0 | 7300 | 0.0380 | 0.7792 | 0.9870 | 0.9867 | nan | 0.9801 | 0.9899 | 0.9898 | 0.9880 | 0.0 | 0.9724 | 0.9687 | 0.9767 | 0.9780 |
| 0.0112 | 147.0 | 7350 | 0.0381 | 0.9740 | 0.9870 | 0.9867 | nan | 0.9800 | 0.9900 | 0.9899 | 0.9880 | nan | 0.9723 | 0.9687 | 0.9767 | 0.9781 |
| 0.0111 | 148.0 | 7400 | 0.0374 | 0.9742 | 0.9871 | 0.9868 | nan | 0.9805 | 0.9895 | 0.9900 | 0.9883 | nan | 0.9726 | 0.9690 | 0.9768 | 0.9782 |
| 0.0111 | 149.0 | 7450 | 0.0378 | 0.9740 | 0.9870 | 0.9868 | nan | 0.9801 | 0.9897 | 0.9902 | 0.9879 | nan | 0.9724 | 0.9687 | 0.9767 | 0.9781 |
| 0.0112 | 150.0 | 7500 | 0.0377 | 0.9741 | 0.9870 | 0.9868 | nan | 0.9800 | 0.9892 | 0.9897 | 0.9891 | nan | 0.9725 | 0.9690 | 0.9767 | 0.9781 |
| 0.0112 | 151.0 | 7550 | 0.0377 | 0.9742 | 0.9871 | 0.9868 | nan | 0.9802 | 0.9893 | 0.9895 | 0.9893 | nan | 0.9725 | 0.9691 | 0.9768 | 0.9782 |
| 0.0111 | 152.0 | 7600 | 0.0374 | 0.9741 | 0.9870 | 0.9868 | nan | 0.9804 | 0.9898 | 0.9898 | 0.9883 | nan | 0.9726 | 0.9690 | 0.9768 | 0.9782 |
| 0.0111 | 153.0 | 7650 | 0.0380 | 0.9740 | 0.9870 | 0.9868 | nan | 0.9800 | 0.9898 | 0.9897 | 0.9884 | nan | 0.9725 | 0.9688 | 0.9767 | 0.9781 |
| 0.0111 | 154.0 | 7700 | 0.0373 | 0.9742 | 0.9871 | 0.9869 | nan | 0.9805 | 0.9891 | 0.9901 | 0.9887 | nan | 0.9727 | 0.9692 | 0.9767 | 0.9782 |
| 0.0111 | 155.0 | 7750 | 0.0375 | 0.9742 | 0.9871 | 0.9868 | nan | 0.9804 | 0.9896 | 0.9893 | 0.9891 | nan | 0.9727 | 0.9692 | 0.9768 | 0.9781 |
| 0.0111 | 156.0 | 7800 | 0.0378 | 0.9741 | 0.9870 | 0.9868 | nan | 0.9801 | 0.9898 | 0.9897 | 0.9886 | nan | 0.9727 | 0.9689 | 0.9767 | 0.9781 |
| 0.0111 | 157.0 | 7850 | 0.0376 | 0.9742 | 0.9871 | 0.9868 | nan | 0.9805 | 0.9891 | 0.9902 | 0.9885 | nan | 0.9727 | 0.9691 | 0.9766 | 0.9782 |
| 0.0111 | 158.0 | 7900 | 0.0375 | 0.9742 | 0.9871 | 0.9868 | nan | 0.9804 | 0.9893 | 0.9899 | 0.9887 | nan | 0.9727 | 0.9691 | 0.9767 | 0.9782 |
| 0.0111 | 159.0 | 7950 | 0.0372 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9805 | 0.9892 | 0.9904 | 0.9884 | nan | 0.9728 | 0.9693 | 0.9766 | 0.9783 |
| 0.0111 | 160.0 | 8000 | 0.0367 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9809 | 0.9896 | 0.9898 | 0.9882 | nan | 0.9730 | 0.9693 | 0.9768 | 0.9782 |
| 0.0111 | 161.0 | 8050 | 0.0370 | 0.9744 | 0.9871 | 0.9869 | nan | 0.9808 | 0.9898 | 0.9894 | 0.9886 | nan | 0.9728 | 0.9693 | 0.9770 | 0.9783 |
| 0.0111 | 162.0 | 8100 | 0.0371 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9806 | 0.9892 | 0.9901 | 0.9885 | nan | 0.9729 | 0.9694 | 0.9767 | 0.9782 |
| 0.0111 | 163.0 | 8150 | 0.0372 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9807 | 0.9894 | 0.9901 | 0.9882 | nan | 0.9729 | 0.9694 | 0.9767 | 0.9781 |
| 0.011 | 164.0 | 8200 | 0.0373 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9805 | 0.9896 | 0.9894 | 0.9889 | nan | 0.9728 | 0.9693 | 0.9768 | 0.9781 |
| 0.011 | 165.0 | 8250 | 0.0371 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9808 | 0.9897 | 0.9898 | 0.9882 | nan | 0.9729 | 0.9694 | 0.9768 | 0.9783 |
| 0.011 | 166.0 | 8300 | 0.0372 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9806 | 0.9897 | 0.9898 | 0.9884 | nan | 0.9729 | 0.9693 | 0.9768 | 0.9781 |
| 0.011 | 167.0 | 8350 | 0.0373 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9804 | 0.9896 | 0.9900 | 0.9885 | nan | 0.9728 | 0.9692 | 0.9768 | 0.9783 |
| 0.011 | 168.0 | 8400 | 0.0369 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9807 | 0.9895 | 0.9899 | 0.9885 | nan | 0.9731 | 0.9695 | 0.9767 | 0.9782 |
| 0.011 | 169.0 | 8450 | 0.0375 | 0.9742 | 0.9871 | 0.9869 | nan | 0.9802 | 0.9897 | 0.9898 | 0.9886 | nan | 0.9727 | 0.9691 | 0.9768 | 0.9782 |
| 0.0109 | 170.0 | 8500 | 0.0363 | 0.9746 | 0.9873 | 0.9871 | nan | 0.9814 | 0.9892 | 0.9894 | 0.9891 | nan | 0.9734 | 0.9699 | 0.9769 | 0.9782 |
| 0.0109 | 171.0 | 8550 | 0.0371 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9805 | 0.9895 | 0.9900 | 0.9885 | nan | 0.9729 | 0.9693 | 0.9767 | 0.9782 |
| 0.011 | 172.0 | 8600 | 0.0371 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9807 | 0.9896 | 0.9898 | 0.9885 | nan | 0.9729 | 0.9693 | 0.9768 | 0.9782 |
| 0.011 | 173.0 | 8650 | 0.0373 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9807 | 0.9893 | 0.9901 | 0.9885 | nan | 0.9728 | 0.9694 | 0.9767 | 0.9783 |
| 0.0109 | 174.0 | 8700 | 0.0372 | 0.9744 | 0.9872 | 0.9869 | nan | 0.9806 | 0.9894 | 0.9898 | 0.9889 | nan | 0.9729 | 0.9694 | 0.9768 | 0.9783 |
| 0.0109 | 175.0 | 8750 | 0.0373 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9806 | 0.9895 | 0.9899 | 0.9885 | nan | 0.9729 | 0.9694 | 0.9768 | 0.9782 |
| 0.0109 | 176.0 | 8800 | 0.0371 | 0.9744 | 0.9872 | 0.9869 | nan | 0.9808 | 0.9894 | 0.9898 | 0.9886 | nan | 0.9730 | 0.9694 | 0.9768 | 0.9782 |
| 0.0109 | 177.0 | 8850 | 0.0370 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9808 | 0.9897 | 0.9896 | 0.9886 | nan | 0.9730 | 0.9695 | 0.9768 | 0.9782 |
| 0.0109 | 178.0 | 8900 | 0.0373 | 0.9744 | 0.9872 | 0.9869 | nan | 0.9808 | 0.9895 | 0.9899 | 0.9885 | nan | 0.9729 | 0.9694 | 0.9768 | 0.9783 |
| 0.0109 | 179.0 | 8950 | 0.0372 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9806 | 0.9894 | 0.9897 | 0.9888 | nan | 0.9729 | 0.9694 | 0.9768 | 0.9782 |
| 0.0109 | 180.0 | 9000 | 0.0368 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9811 | 0.9894 | 0.9897 | 0.9885 | nan | 0.9731 | 0.9696 | 0.9768 | 0.9781 |
| 0.0109 | 181.0 | 9050 | 0.0371 | 0.9744 | 0.9872 | 0.9869 | nan | 0.9807 | 0.9894 | 0.9900 | 0.9886 | nan | 0.9730 | 0.9694 | 0.9768 | 0.9783 |
| 0.0109 | 182.0 | 9100 | 0.0370 | 0.9744 | 0.9872 | 0.9869 | nan | 0.9808 | 0.9894 | 0.9898 | 0.9887 | nan | 0.9730 | 0.9695 | 0.9768 | 0.9782 |
| 0.0109 | 183.0 | 9150 | 0.0368 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9810 | 0.9892 | 0.9901 | 0.9885 | nan | 0.9732 | 0.9697 | 0.9767 | 0.9782 |
| 0.0108 | 184.0 | 9200 | 0.0373 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9805 | 0.9896 | 0.9897 | 0.9887 | nan | 0.9729 | 0.9693 | 0.9767 | 0.9782 |
| 0.0108 | 185.0 | 9250 | 0.0371 | 0.9743 | 0.9872 | 0.9869 | nan | 0.9806 | 0.9895 | 0.9900 | 0.9885 | nan | 0.9730 | 0.9694 | 0.9767 | 0.9783 |
| 0.0108 | 186.0 | 9300 | 0.0371 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9808 | 0.9897 | 0.9896 | 0.9886 | nan | 0.9731 | 0.9695 | 0.9769 | 0.9782 |
| 0.0108 | 187.0 | 9350 | 0.0371 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9808 | 0.9895 | 0.9899 | 0.9886 | nan | 0.9731 | 0.9695 | 0.9768 | 0.9782 |
| 0.0108 | 188.0 | 9400 | 0.0370 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9808 | 0.9893 | 0.9900 | 0.9886 | nan | 0.9730 | 0.9695 | 0.9768 | 0.9782 |
| 0.0108 | 189.0 | 9450 | 0.0371 | 0.9743 | 0.9872 | 0.9869 | nan | 0.9807 | 0.9895 | 0.9901 | 0.9883 | nan | 0.9730 | 0.9694 | 0.9767 | 0.9782 |
| 0.0108 | 190.0 | 9500 | 0.0370 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9808 | 0.9896 | 0.9899 | 0.9885 | nan | 0.9731 | 0.9695 | 0.9768 | 0.9782 |
| 0.0108 | 191.0 | 9550 | 0.0373 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9806 | 0.9895 | 0.9899 | 0.9886 | nan | 0.9729 | 0.9693 | 0.9768 | 0.9782 |
| 0.0108 | 192.0 | 9600 | 0.0371 | 0.9744 | 0.9872 | 0.9869 | nan | 0.9807 | 0.9894 | 0.9899 | 0.9887 | nan | 0.9730 | 0.9695 | 0.9768 | 0.9782 |
| 0.0108 | 193.0 | 9650 | 0.0374 | 0.9743 | 0.9871 | 0.9869 | nan | 0.9805 | 0.9898 | 0.9897 | 0.9886 | nan | 0.9729 | 0.9693 | 0.9768 | 0.9782 |
| 0.0108 | 194.0 | 9700 | 0.0371 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9807 | 0.9896 | 0.9899 | 0.9885 | nan | 0.9730 | 0.9695 | 0.9768 | 0.9783 |
| 0.0108 | 195.0 | 9750 | 0.0370 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9809 | 0.9896 | 0.9898 | 0.9885 | nan | 0.9731 | 0.9696 | 0.9768 | 0.9782 |
| 0.0108 | 196.0 | 9800 | 0.0370 | 0.9745 | 0.9872 | 0.9870 | nan | 0.9810 | 0.9894 | 0.9898 | 0.9887 | nan | 0.9732 | 0.9697 | 0.9768 | 0.9782 |
| 0.0108 | 197.0 | 9850 | 0.0371 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9809 | 0.9896 | 0.9897 | 0.9886 | nan | 0.9731 | 0.9695 | 0.9768 | 0.9782 |
| 0.0108 | 198.0 | 9900 | 0.0371 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9809 | 0.9896 | 0.9897 | 0.9886 | nan | 0.9731 | 0.9695 | 0.9768 | 0.9782 |
| 0.0108 | 199.0 | 9950 | 0.0371 | 0.9744 | 0.9872 | 0.9870 | nan | 0.9809 | 0.9895 | 0.9898 | 0.9886 | nan | 0.9731 | 0.9696 | 0.9768 | 0.9782 |
| 0.0108 | 200.0 | 10000 | 0.0372 | 0.9744 | 0.9872 | 0.9869 | nan | 0.9806 | 0.9893 | 0.9901 | 0.9887 | nan | 0.9730 | 0.9695 | 0.9767 | 0.9782 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
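The class labels for this checkpoint (listed below) are stored in Chinese. As a reading aid, a hypothetical `id2label` mapping with English glosses; the id order is an assumption based on the list order, not read from the config:

```python
# Hypothetical id2label mapping with English glosses.
# Label strings are exactly as stored in the checkpoint; the id order is assumed.
id2label = {
    0: "背景",            # background
    1: "未水化水泥颗粒",  # unhydrated cement particles
    2: "孔隙",            # pores
    3: "氢氧化钙",        # calcium hydroxide (portlandite)
    4: "其他水化物",      # other hydration products
}
```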
| [
"背景",
"未水化水泥颗粒",
"孔隙",
"氢氧化钙",
"其他水化物"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-coord
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the NICOPOI-9/morphpad_coord_hgo_512_4class dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0306
- Mean Iou: 0.9858
- Mean Accuracy: 0.9928
- Overall Accuracy: 0.9928
- Accuracy 0-0: 0.9933
- Accuracy 0-90: 0.9937
- Accuracy 90-0: 0.9943
- Accuracy 90-90: 0.9898
- Iou 0-0: 0.9885
- Iou 0-90: 0.9850
- Iou 90-0: 0.9826
- Iou 90-90: 0.9872
## Model description
More information needed
## Intended uses & limitations
More information needed
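A minimal inference sketch, assuming the checkpoint is hosted under this repo id and follows the standard SegFormer layout in `transformers`:

```python
# Minimal inference sketch (assumptions: repo id below, standard SegFormer preprocessing).
from PIL import Image
import torch
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo = "NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo).eval()

image = Image.open("tile.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) map of {0-0, 0-90, 90-0, 90-90} ids
```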
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 80
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:------:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 1.2185 | 2.5445 | 4000 | 1.2349 | 0.2290 | 0.3745 | 0.3762 | 0.2785 | 0.4062 | 0.4936 | 0.3198 | 0.2085 | 0.2334 | 0.2525 | 0.2216 |
| 1.0978 | 5.0891 | 8000 | 1.1020 | 0.2905 | 0.4487 | 0.4508 | 0.3780 | 0.5302 | 0.5341 | 0.3524 | 0.2937 | 0.2870 | 0.2991 | 0.2822 |
| 0.9886 | 7.6336 | 12000 | 1.0139 | 0.3231 | 0.4871 | 0.4896 | 0.4154 | 0.4500 | 0.7245 | 0.3585 | 0.3291 | 0.3272 | 0.3266 | 0.3096 |
| 0.9358 | 10.1781 | 16000 | 0.9575 | 0.3517 | 0.5195 | 0.5215 | 0.3765 | 0.6411 | 0.5865 | 0.4740 | 0.3438 | 0.3539 | 0.3617 | 0.3473 |
| 0.8735 | 12.7226 | 20000 | 0.8853 | 0.4007 | 0.5704 | 0.5726 | 0.4998 | 0.5637 | 0.7536 | 0.4647 | 0.4109 | 0.3953 | 0.4055 | 0.3913 |
| 0.7186 | 15.2672 | 24000 | 0.6833 | 0.5558 | 0.7151 | 0.7141 | 0.7389 | 0.6650 | 0.6919 | 0.7647 | 0.5919 | 0.5261 | 0.5453 | 0.5598 |
| 0.6514 | 17.8117 | 28000 | 0.4379 | 0.7017 | 0.8243 | 0.8243 | 0.8344 | 0.8161 | 0.8279 | 0.8187 | 0.7198 | 0.6807 | 0.6933 | 0.7130 |
| 0.603 | 20.3562 | 32000 | 0.2900 | 0.7980 | 0.8879 | 0.8874 | 0.9117 | 0.8490 | 0.8888 | 0.9020 | 0.8160 | 0.7726 | 0.7893 | 0.8142 |
| 0.2448 | 22.9008 | 36000 | 0.2154 | 0.8496 | 0.9184 | 0.9185 | 0.9330 | 0.9179 | 0.9170 | 0.9058 | 0.8683 | 0.8329 | 0.8445 | 0.8527 |
| 0.2766 | 25.4453 | 40000 | 0.2004 | 0.8612 | 0.9254 | 0.9254 | 0.9487 | 0.9059 | 0.9381 | 0.9088 | 0.8717 | 0.8469 | 0.8635 | 0.8628 |
| 0.6278 | 27.9898 | 44000 | 0.1410 | 0.8976 | 0.9459 | 0.9459 | 0.9426 | 0.9377 | 0.9559 | 0.9474 | 0.9075 | 0.8863 | 0.8932 | 0.9034 |
| 0.1684 | 30.5344 | 48000 | 0.1163 | 0.9137 | 0.9549 | 0.9548 | 0.9595 | 0.9417 | 0.9579 | 0.9605 | 0.9245 | 0.9046 | 0.9069 | 0.9187 |
| 0.0638 | 33.0789 | 52000 | 0.0927 | 0.9338 | 0.9657 | 0.9657 | 0.9697 | 0.9589 | 0.9715 | 0.9627 | 0.9406 | 0.9291 | 0.9291 | 0.9363 |
| 0.0749 | 35.6234 | 56000 | 0.0836 | 0.9382 | 0.9680 | 0.9680 | 0.9714 | 0.9663 | 0.9680 | 0.9664 | 0.9449 | 0.9325 | 0.9339 | 0.9414 |
| 0.045 | 38.1679 | 60000 | 0.0624 | 0.9545 | 0.9767 | 0.9767 | 0.9787 | 0.9751 | 0.9763 | 0.9766 | 0.9587 | 0.9521 | 0.9499 | 0.9573 |
| 0.1278 | 40.7125 | 64000 | 0.0635 | 0.9546 | 0.9767 | 0.9767 | 0.9773 | 0.9743 | 0.9813 | 0.9737 | 0.9598 | 0.9521 | 0.9492 | 0.9572 |
| 0.0443 | 43.2570 | 68000 | 0.0598 | 0.9584 | 0.9787 | 0.9787 | 0.9815 | 0.9723 | 0.9858 | 0.9752 | 0.9624 | 0.9548 | 0.9548 | 0.9617 |
| 0.0337 | 45.8015 | 72000 | 0.0549 | 0.9622 | 0.9807 | 0.9807 | 0.9877 | 0.9804 | 0.9820 | 0.9726 | 0.9648 | 0.9587 | 0.9622 | 0.9632 |
| 0.0434 | 48.3461 | 76000 | 0.0539 | 0.9643 | 0.9816 | 0.9817 | 0.9793 | 0.9779 | 0.9913 | 0.9781 | 0.9691 | 0.9611 | 0.9565 | 0.9703 |
| 0.1576 | 50.8906 | 80000 | 0.0577 | 0.9656 | 0.9825 | 0.9825 | 0.9799 | 0.9822 | 0.9825 | 0.9856 | 0.9694 | 0.9634 | 0.9653 | 0.9645 |
| 0.025 | 53.4351 | 84000 | 0.0453 | 0.9724 | 0.9860 | 0.9860 | 0.9856 | 0.9884 | 0.9840 | 0.9858 | 0.9762 | 0.9698 | 0.9697 | 0.9739 |
| 0.0318 | 55.9796 | 88000 | 0.0401 | 0.9733 | 0.9865 | 0.9865 | 0.9884 | 0.9845 | 0.9865 | 0.9865 | 0.9766 | 0.9700 | 0.9714 | 0.9753 |
| 0.1355 | 58.5242 | 92000 | 0.0453 | 0.9764 | 0.9880 | 0.9880 | 0.9896 | 0.9874 | 0.9889 | 0.9861 | 0.9796 | 0.9742 | 0.9731 | 0.9786 |
| 0.0256 | 61.0687 | 96000 | 0.0359 | 0.9817 | 0.9907 | 0.9908 | 0.9902 | 0.9925 | 0.9902 | 0.9901 | 0.9846 | 0.9808 | 0.9783 | 0.9833 |
| 0.019 | 63.6132 | 100000 | 0.0320 | 0.9819 | 0.9908 | 0.9909 | 0.9914 | 0.9908 | 0.9936 | 0.9875 | 0.9838 | 0.9812 | 0.9787 | 0.9841 |
| 0.0713 | 66.1578 | 104000 | 0.0319 | 0.9827 | 0.9912 | 0.9912 | 0.9940 | 0.9922 | 0.9937 | 0.9847 | 0.9859 | 0.9812 | 0.9807 | 0.9828 |
| 0.1036 | 68.7023 | 108000 | 0.0369 | 0.9807 | 0.9902 | 0.9903 | 0.9932 | 0.9916 | 0.9946 | 0.9813 | 0.9844 | 0.9807 | 0.9790 | 0.9788 |
| 0.0575 | 71.2468 | 112000 | 0.0338 | 0.9843 | 0.9921 | 0.9921 | 0.9939 | 0.9913 | 0.9929 | 0.9901 | 0.9870 | 0.9822 | 0.9814 | 0.9867 |
| 0.0136 | 73.7913 | 116000 | 0.0259 | 0.9870 | 0.9934 | 0.9934 | 0.9926 | 0.9936 | 0.9946 | 0.9930 | 0.9889 | 0.9852 | 0.9850 | 0.9891 |
| 0.045 | 76.3359 | 120000 | 0.0310 | 0.9844 | 0.9921 | 0.9921 | 0.9913 | 0.9926 | 0.9941 | 0.9902 | 0.9866 | 0.9834 | 0.9805 | 0.9871 |
| 0.6665 | 78.8804 | 124000 | 0.0306 | 0.9858 | 0.9928 | 0.9928 | 0.9933 | 0.9937 | 0.9943 | 0.9898 | 0.9885 | 0.9850 | 0.9826 | 0.9872 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
jgerbscheid/segformer_b1-nlver_finetuned-1024-1024 |
# NL-aerial segmentation
This is a SegFormer segmentation model finetuned on random samples from the full 41,000 square kilometers of aerial photography data (see [pdok aerial data](https://www.pdok.nl/introductie/-/article/pdok-luchtfoto-rgb-open-)), using the BGT (see [pdok BGT data](https://www.pdok.nl/introductie/-/article/basisregistratie-grootschalige-topografie-bgt-)).
Specifically, it takes 1024x1024 aerial photographs at a resolution of 8 cm/pixel and predicts water, buildings, roads/pavement, and vegetation.
This model is part of the NL-veranderdetectie project ([summary](https://www.winnovatie.nl/kennisbank/3041030.aspx?t=Learning-paper-NL-Veranderdetectie-Fase-1), in Dutch).
The model was trained using this [codebase](https://gitlab.com/hetwaterschapshuis/kenniscentrum/tooling/nlveranderdetectie/); see the repo for more details.
## Model Details
A regular SegFormer, with the classification head adjusted to 5 classes.
### Model Description
- **Developed by:** [Het Waterschapshuis](https://www.hetwaterschapshuis.nl/), in collaboration with the Dutch water boards
<!-- - **Funded by [optional]:** [] -->
<!-- - **Shared by [optional]:** [More Information Needed] -->
- **Model type:** SegFormer semantic segmentation model
- **License:** MIT
- **Finetuned from model:** [nvidia/segformer-b1-finetuned-cityscapes-1024-1024](https://huggingface.co/nvidia/segformer-b1-finetuned-cityscapes-1024-1024)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** [https://gitlab.com/hetwaterschapshuis/kenniscentrum/tooling/nlveranderdetectie/](https://gitlab.com/hetwaterschapshuis/kenniscentrum/tooling/nlveranderdetectie/)
<!-- - **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed] -->
## Uses
This model is used to create a segmentation map of the Dutch waterways based on aerial imagery. This segmentation map is then used to update the BGT, a national map maintained by the Dutch government.
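Since the model expects 1024x1024 tiles at 8 cm/pixel, larger images have to be processed patchwise. Below is a sketch of non-overlapping tile inference, assuming the checkpoint follows the standard SegFormer layout; the tiling loop is illustrative, and edge remainders are simply skipped:

```python
# Sketch: tile-based inference on a large aerial image (assumptions: repo id,
# standard SegFormer preprocessing; non-overlapping 1024x1024 tiles for brevity).
import numpy as np
from PIL import Image
import torch
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo = "jgerbscheid/segformer_b1-nlver_finetuned-1024-1024"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo).eval()

image = np.array(Image.open("aerial.tif").convert("RGB"))  # hypothetical 8 cm/px image
h, w, _ = image.shape
out = np.zeros((h, w), dtype=np.uint8)

for y in range(0, h - h % 1024, 1024):
    for x in range(0, w - w % 1024, 1024):
        tile = image[y:y + 1024, x:x + 1024]
        inputs = processor(images=tile, return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        logits = torch.nn.functional.interpolate(
            logits, size=(1024, 1024), mode="bilinear", align_corners=False
        )
        out[y:y + 1024, x:x + 1024] = logits.argmax(dim=1)[0].numpy().astype(np.uint8)
```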
#### Training Hyperparameters
See the [nlveranderdetectie repository](https://gitlab.com/hetwaterschapshuis/kenniscentrum/tooling/nlveranderdetectie/) for details of the training setup; specifically, look at the YAML configs in the run/configs directory.
### Hardware
The model was trained on a single NVIDIA RTX 3090 GPU for ~2 days, or ~126k forward passes with a batch size of 8.
## Model Card Authors
The model card was written by J. Gerbscheid, email: [email protected]
## Model Card Contact
The model was trained by J. Gerbscheid, email: [email protected]
| [
"background",
"waterdeel",
"pand",
"wegdeel",
"overbruggingsdeel",
"vegetatie"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v1 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-coord-v1
This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the NICOPOI-9/morphpad_coord_hgo_512_4class dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0644
- Mean Iou: 0.9579
- Mean Accuracy: 0.9785
- Overall Accuracy: 0.9785
- Accuracy 0-0: 0.9792
- Accuracy 0-90: 0.9782
- Accuracy 90-0: 0.9762
- Accuracy 90-90: 0.9804
- Iou 0-0: 0.9634
- Iou 0-90: 0.9512
- Iou 90-0: 0.9543
- Iou 90-90: 0.9627
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 1.1962 | 2.5445 | 4000 | 1.2063 | 0.2464 | 0.3994 | 0.4010 | 0.2991 | 0.2803 | 0.5096 | 0.5084 | 0.2367 | 0.2111 | 0.2658 | 0.2721 |
| 1.051 | 5.0891 | 8000 | 1.0734 | 0.3118 | 0.4765 | 0.4765 | 0.3856 | 0.5520 | 0.3937 | 0.5745 | 0.3123 | 0.3148 | 0.2968 | 0.3233 |
| 0.9309 | 7.6336 | 12000 | 0.9672 | 0.3612 | 0.5314 | 0.5323 | 0.4806 | 0.5216 | 0.7075 | 0.4158 | 0.3362 | 0.3706 | 0.3778 | 0.3603 |
| 0.8041 | 10.1781 | 16000 | 0.8444 | 0.4475 | 0.6180 | 0.6178 | 0.6131 | 0.6672 | 0.6120 | 0.5798 | 0.4403 | 0.4360 | 0.4543 | 0.4593 |
| 0.6617 | 12.7226 | 20000 | 0.7405 | 0.5039 | 0.6697 | 0.6700 | 0.6310 | 0.6588 | 0.6714 | 0.7177 | 0.5097 | 0.4912 | 0.5114 | 0.5033 |
| 0.54 | 15.2672 | 24000 | 0.6090 | 0.5828 | 0.7360 | 0.7362 | 0.6931 | 0.7532 | 0.7427 | 0.7550 | 0.5911 | 0.5709 | 0.5876 | 0.5819 |
| 0.7378 | 17.8117 | 28000 | 0.3740 | 0.7401 | 0.8507 | 0.8505 | 0.8789 | 0.8270 | 0.8186 | 0.8783 | 0.7712 | 0.7324 | 0.7203 | 0.7366 |
| 0.58 | 20.3562 | 32000 | 0.1892 | 0.8644 | 0.9272 | 0.9272 | 0.9329 | 0.9188 | 0.9142 | 0.9430 | 0.8810 | 0.8523 | 0.8539 | 0.8704 |
| 0.1305 | 22.9008 | 36000 | 0.1473 | 0.8945 | 0.9443 | 0.9443 | 0.9563 | 0.9245 | 0.9421 | 0.9542 | 0.9021 | 0.8783 | 0.8925 | 0.9049 |
| 0.1775 | 25.4453 | 40000 | 0.1133 | 0.9178 | 0.9571 | 0.9571 | 0.9578 | 0.9536 | 0.9583 | 0.9586 | 0.9264 | 0.9068 | 0.9130 | 0.9249 |
| 0.4792 | 27.9898 | 44000 | 0.0961 | 0.9306 | 0.9640 | 0.9640 | 0.9662 | 0.9633 | 0.9617 | 0.9650 | 0.9374 | 0.9194 | 0.9268 | 0.9387 |
| 0.1084 | 30.5344 | 48000 | 0.0886 | 0.9364 | 0.9671 | 0.9672 | 0.9684 | 0.9600 | 0.9689 | 0.9712 | 0.9429 | 0.9257 | 0.9335 | 0.9437 |
| 0.0471 | 33.0789 | 52000 | 0.0721 | 0.9485 | 0.9735 | 0.9735 | 0.9772 | 0.9674 | 0.9729 | 0.9767 | 0.9528 | 0.9402 | 0.9467 | 0.9542 |
| 0.0722 | 35.6234 | 56000 | 0.0646 | 0.9554 | 0.9772 | 0.9772 | 0.9809 | 0.9728 | 0.9757 | 0.9794 | 0.9576 | 0.9488 | 0.9522 | 0.9629 |
| 0.0406 | 38.1679 | 60000 | 0.0644 | 0.9579 | 0.9785 | 0.9785 | 0.9792 | 0.9782 | 0.9762 | 0.9804 | 0.9634 | 0.9512 | 0.9543 | 0.9627 |
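The Mean IoU, Mean Accuracy, and Overall Accuracy columns above follow the usual per-class definitions; a minimal NumPy sketch of how such numbers can be derived from a confusion matrix (classes that never occur produce nan, which `nanmean` then skips, matching the nan entries seen in tables like this):

```python
# Per-class IoU/accuracy from a confusion matrix (standard definitions).
# 0/0 divisions yield nan for absent classes; nanmean skips them.
import numpy as np

def segmentation_metrics(pred: np.ndarray, target: np.ndarray, num_classes: int):
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    np.add.at(cm, (target.ravel(), pred.ravel()), 1)
    tp = np.diag(cm).astype(float)
    with np.errstate(invalid="ignore", divide="ignore"):
        iou = tp / (cm.sum(axis=0) + cm.sum(axis=1) - tp)  # per-class IoU
        acc = tp / cm.sum(axis=1)                          # per-class accuracy
    overall = tp.sum() / cm.sum()
    return np.nanmean(iou), np.nanmean(acc), overall
```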
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-coord-v2
This model is a fine-tuned version of [NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v1](https://huggingface.co/NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v1) on the NICOPOI-9/morphpad_coord_hgo_512_4class dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0408
- Mean Iou: 0.9952
- Mean Accuracy: 0.9976
- Overall Accuracy: 0.9976
- Accuracy 0-0: 0.9993
- Accuracy 0-90: 0.9958
- Accuracy 90-0: 0.9969
- Accuracy 90-90: 0.9983
- Iou 0-0: 0.9975
- Iou 0-90: 0.9929
- Iou 90-0: 0.9949
- Iou 90-90: 0.9955
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 60
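This card's distinguishing detail is that training resumes from the v1 checkpoint; a rough sketch of how the listed hyperparameters map onto `TrainingArguments` (illustrative only; dataset preparation and the `Trainer` wiring are omitted):

```python
# Sketch: resuming from the v1 checkpoint with the listed hyperparameters.
from transformers import SegformerForSemanticSegmentation, TrainingArguments

model = SegformerForSemanticSegmentation.from_pretrained(
    "NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v1"
)
args = TrainingArguments(
    output_dir="segformer-b0-finetuned-morphpadver1-hgo-coord-v2",
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",  # AdamW is the default optimizer, as listed above
    num_train_epochs=60,
)
```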
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 0.0654 | 2.5445 | 4000 | 0.1134 | 0.9236 | 0.9604 | 0.9602 | 0.9729 | 0.9373 | 0.9642 | 0.9674 | 0.9265 | 0.9144 | 0.9233 | 0.9301 |
| 0.0552 | 5.0891 | 8000 | 0.1426 | 0.9161 | 0.9562 | 0.9561 | 0.9607 | 0.9538 | 0.9547 | 0.9555 | 0.9166 | 0.9146 | 0.9112 | 0.9218 |
| 0.0469 | 7.6336 | 12000 | 0.0633 | 0.9556 | 0.9774 | 0.9773 | 0.9811 | 0.9714 | 0.9744 | 0.9826 | 0.9588 | 0.9516 | 0.9545 | 0.9576 |
| 0.0378 | 10.1781 | 16000 | 0.0506 | 0.9650 | 0.9822 | 0.9822 | 0.9826 | 0.9773 | 0.9844 | 0.9844 | 0.9661 | 0.9601 | 0.9643 | 0.9696 |
| 0.0582 | 12.7226 | 20000 | 0.0402 | 0.9737 | 0.9867 | 0.9866 | 0.9925 | 0.9891 | 0.9791 | 0.9860 | 0.9774 | 0.9700 | 0.9699 | 0.9774 |
| 0.0322 | 15.2672 | 24000 | 0.0453 | 0.9707 | 0.9850 | 0.9851 | 0.9809 | 0.9843 | 0.9909 | 0.9840 | 0.9746 | 0.9715 | 0.9637 | 0.9728 |
| 0.0254 | 17.8117 | 28000 | 0.1030 | 0.9652 | 0.9823 | 0.9822 | 0.9895 | 0.9808 | 0.9748 | 0.9841 | 0.9761 | 0.9599 | 0.9583 | 0.9666 |
| 2.3028 | 20.3562 | 32000 | 0.0572 | 0.9745 | 0.9871 | 0.9870 | 0.9861 | 0.9839 | 0.9885 | 0.9896 | 0.9789 | 0.9717 | 0.9700 | 0.9773 |
| 0.0769 | 22.9008 | 36000 | 0.0225 | 0.9866 | 0.9932 | 0.9932 | 0.9960 | 0.9899 | 0.9939 | 0.9932 | 0.9893 | 0.9837 | 0.9849 | 0.9884 |
| 0.0512 | 25.4453 | 40000 | 0.0329 | 0.9850 | 0.9924 | 0.9924 | 0.9959 | 0.9867 | 0.9954 | 0.9917 | 0.9857 | 0.9820 | 0.9843 | 0.9878 |
| 0.3281 | 27.9898 | 44000 | 0.0301 | 0.9866 | 0.9933 | 0.9932 | 0.9958 | 0.9913 | 0.9907 | 0.9952 | 0.9899 | 0.9858 | 0.9843 | 0.9863 |
| 0.1536 | 30.5344 | 48000 | 0.0355 | 0.9889 | 0.9944 | 0.9944 | 0.9981 | 0.9927 | 0.9920 | 0.9949 | 0.9941 | 0.9855 | 0.9880 | 0.9880 |
| 0.0079 | 33.0789 | 52000 | 0.0256 | 0.9933 | 0.9966 | 0.9966 | 0.9979 | 0.9951 | 0.9961 | 0.9974 | 0.9956 | 0.9917 | 0.9934 | 0.9924 |
| 0.0074 | 35.6234 | 56000 | 0.0205 | 0.9938 | 0.9969 | 0.9969 | 0.9983 | 0.9970 | 0.9966 | 0.9956 | 0.9963 | 0.9923 | 0.9928 | 0.9939 |
| 0.0077 | 38.1679 | 60000 | 0.0255 | 0.9933 | 0.9967 | 0.9966 | 0.9985 | 0.9946 | 0.9964 | 0.9971 | 0.9954 | 0.9925 | 0.9919 | 0.9934 |
| 0.0061 | 40.7125 | 64000 | 0.0282 | 0.9945 | 0.9972 | 0.9972 | 0.9987 | 0.9958 | 0.9974 | 0.9969 | 0.9967 | 0.9916 | 0.9950 | 0.9945 |
| 0.0051 | 43.2570 | 68000 | 0.0262 | 0.9937 | 0.9969 | 0.9968 | 0.9987 | 0.9949 | 0.9959 | 0.9979 | 0.9968 | 0.9916 | 0.9934 | 0.9930 |
| 0.0047 | 45.8015 | 72000 | 0.0564 | 0.9912 | 0.9956 | 0.9956 | 0.9991 | 0.9950 | 0.9940 | 0.9943 | 0.9958 | 0.9882 | 0.9897 | 0.9912 |
| 0.0046 | 48.3461 | 76000 | 0.0492 | 0.9939 | 0.9969 | 0.9969 | 0.9992 | 0.9941 | 0.9974 | 0.9970 | 0.9969 | 0.9903 | 0.9938 | 0.9945 |
| 0.0552 | 50.8906 | 80000 | 0.0438 | 0.9948 | 0.9974 | 0.9974 | 0.9992 | 0.9966 | 0.9967 | 0.9972 | 0.9980 | 0.9924 | 0.9948 | 0.9941 |
| 0.0039 | 53.4351 | 84000 | 0.0361 | 0.9953 | 0.9976 | 0.9976 | 0.9991 | 0.9961 | 0.9973 | 0.9981 | 0.9975 | 0.9928 | 0.9952 | 0.9956 |
| 0.0034 | 55.9796 | 88000 | 0.0317 | 0.9958 | 0.9979 | 0.9979 | 0.9993 | 0.9964 | 0.9974 | 0.9985 | 0.9979 | 0.9937 | 0.9955 | 0.9963 |
| 0.0149 | 58.5242 | 92000 | 0.0408 | 0.9952 | 0.9976 | 0.9976 | 0.9993 | 0.9958 | 0.9969 | 0.9983 | 0.9975 | 0.9929 | 0.9949 | 0.9955 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v3 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-coord-v3
This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the NICOPOI-9/morphpad_coord_hgo_512_4class dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0298
- Mean Iou: 0.9952
- Mean Accuracy: 0.9976
- Overall Accuracy: 0.9976
- Accuracy 0-0: 0.9983
- Accuracy 0-90: 0.9969
- Accuracy 90-0: 0.9980
- Accuracy 90-90: 0.9971
- Iou 0-0: 0.9959
- Iou 0-90: 0.9950
- Iou 90-0: 0.9945
- Iou 90-90: 0.9954
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 60
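This fine-tune starts from the `nvidia/mit-b3` backbone with a fresh 4-class head; a minimal initialization sketch, with `id2label` taken from the label list at the end of this card:

```python
# Sketch: attaching a 4-class segmentation head to the mit-b3 backbone.
# id2label values come from this card's label list; everything else is standard.
from transformers import SegformerForSemanticSegmentation

id2label = {0: "0-0", 1: "0-90", 2: "90-0", 3: "90-90"}
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b3",
    num_labels=4,
    id2label=id2label,
    label2id={v: k for k, v in id2label.items()},
)
```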
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 0.9708 | 2.6525 | 4000 | 0.9827 | 0.3521 | 0.5133 | 0.5141 | 0.6031 | 0.4019 | 0.6068 | 0.4413 | 0.4004 | 0.3106 | 0.2982 | 0.3990 |
| 1.8598 | 5.3050 | 8000 | 0.6389 | 0.5316 | 0.6914 | 0.6913 | 0.6875 | 0.7105 | 0.5770 | 0.7908 | 0.6004 | 0.4655 | 0.5143 | 0.5463 |
| 0.4633 | 7.9576 | 12000 | 0.5041 | 0.6369 | 0.7767 | 0.7768 | 0.7883 | 0.7582 | 0.7869 | 0.7735 | 0.6734 | 0.6126 | 0.5897 | 0.6720 |
| 0.5576 | 10.6101 | 16000 | 0.4221 | 0.6949 | 0.8191 | 0.8194 | 0.8477 | 0.7814 | 0.8269 | 0.8206 | 0.7084 | 0.7075 | 0.6533 | 0.7106 |
| 0.3003 | 13.2626 | 20000 | 0.3963 | 0.7293 | 0.8413 | 0.8414 | 0.8524 | 0.8809 | 0.8047 | 0.8273 | 0.7565 | 0.6617 | 0.7326 | 0.7666 |
| 0.3174 | 15.9151 | 24000 | 0.4310 | 0.7511 | 0.8573 | 0.8572 | 0.8538 | 0.8774 | 0.8402 | 0.8577 | 0.7779 | 0.7125 | 0.7412 | 0.7729 |
| 0.2307 | 18.5676 | 28000 | 0.3326 | 0.8024 | 0.8901 | 0.8902 | 0.9029 | 0.8852 | 0.8832 | 0.8893 | 0.8196 | 0.7874 | 0.7805 | 0.8221 |
| 0.1946 | 21.2202 | 32000 | 0.2625 | 0.8409 | 0.9134 | 0.9134 | 0.9136 | 0.9197 | 0.8974 | 0.9230 | 0.8533 | 0.8114 | 0.8410 | 0.8581 |
| 0.1325 | 23.8727 | 36000 | 0.1298 | 0.9185 | 0.9575 | 0.9576 | 0.9647 | 0.9488 | 0.9691 | 0.9474 | 0.9319 | 0.9114 | 0.9158 | 0.9150 |
| 0.1007 | 26.5252 | 40000 | 0.0752 | 0.9596 | 0.9794 | 0.9794 | 0.9824 | 0.9811 | 0.9748 | 0.9792 | 0.9671 | 0.9557 | 0.9532 | 0.9624 |
| 0.0336 | 29.1777 | 44000 | 0.2640 | 0.9435 | 0.9709 | 0.9709 | 0.9750 | 0.9668 | 0.9713 | 0.9706 | 0.9473 | 0.9389 | 0.9382 | 0.9497 |
| 0.0182 | 31.8302 | 48000 | 0.1066 | 0.9680 | 0.9837 | 0.9837 | 0.9882 | 0.9801 | 0.9845 | 0.9821 | 0.9712 | 0.9677 | 0.9627 | 0.9703 |
| 0.0141 | 34.4828 | 52000 | 0.0716 | 0.9806 | 0.9902 | 0.9902 | 0.9880 | 0.9898 | 0.9893 | 0.9937 | 0.9822 | 0.9789 | 0.9792 | 0.9821 |
| 0.0117 | 37.1353 | 56000 | 0.0705 | 0.9850 | 0.9925 | 0.9925 | 0.9914 | 0.9920 | 0.9929 | 0.9935 | 0.9861 | 0.9838 | 0.9842 | 0.9859 |
| 0.0129 | 39.7878 | 60000 | 0.0932 | 0.9833 | 0.9916 | 0.9916 | 0.9897 | 0.9930 | 0.9920 | 0.9916 | 0.9834 | 0.9823 | 0.9827 | 0.9846 |
| 0.0101 | 42.4403 | 64000 | 0.0302 | 0.9924 | 0.9962 | 0.9962 | 0.9964 | 0.9966 | 0.9972 | 0.9946 | 0.9938 | 0.9917 | 0.9921 | 0.9921 |
| 0.0071 | 45.0928 | 68000 | 0.0294 | 0.9933 | 0.9966 | 0.9966 | 0.9971 | 0.9961 | 0.9958 | 0.9976 | 0.9946 | 0.9931 | 0.9925 | 0.9930 |
| 0.0066 | 47.7454 | 72000 | 0.0637 | 0.9912 | 0.9956 | 0.9956 | 0.9959 | 0.9957 | 0.9962 | 0.9946 | 0.9927 | 0.9917 | 0.9895 | 0.9910 |
| 0.0083 | 50.3979 | 76000 | 0.0494 | 0.9901 | 0.9950 | 0.9950 | 0.9956 | 0.9951 | 0.9946 | 0.9948 | 0.9901 | 0.9899 | 0.9887 | 0.9917 |
| 0.0049 | 53.0504 | 80000 | 0.0286 | 0.9944 | 0.9972 | 0.9972 | 0.9982 | 0.9967 | 0.9976 | 0.9963 | 0.9956 | 0.9935 | 0.9939 | 0.9946 |
| 0.0104 | 55.7029 | 84000 | 0.0244 | 0.9953 | 0.9976 | 0.9976 | 0.9984 | 0.9971 | 0.9978 | 0.9971 | 0.9957 | 0.9953 | 0.9942 | 0.9959 |
| 0.0046 | 58.3554 | 88000 | 0.0298 | 0.9952 | 0.9976 | 0.9976 | 0.9983 | 0.9969 | 0.9980 | 0.9971 | 0.9959 | 0.9950 | 0.9945 | 0.9954 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v3_1 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-coord-v3_1
This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0117
- Mean Iou: 0.9981
- Mean Accuracy: 0.9990
- Overall Accuracy: 0.9990
- Accuracy 0-0: 0.9995
- Accuracy 0-90: 0.9985
- Accuracy 90-0: 0.9988
- Accuracy 90-90: 0.9993
- Iou 0-0: 0.9991
- Iou 0-90: 0.9979
- Iou 90-0: 0.9976
- Iou 90-90: 0.9978
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 60
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 0.903 | 2.6525 | 4000 | 0.8952 | 0.3916 | 0.5570 | 0.5567 | 0.5335 | 0.6125 | 0.4890 | 0.5929 | 0.4534 | 0.3418 | 0.3411 | 0.4299 |
| 0.6373 | 5.3050 | 8000 | 0.5078 | 0.6237 | 0.7643 | 0.7643 | 0.7676 | 0.8339 | 0.6741 | 0.7817 | 0.6758 | 0.5472 | 0.6022 | 0.6698 |
| 0.2851 | 7.9576 | 12000 | 0.2955 | 0.7612 | 0.8642 | 0.8642 | 0.8669 | 0.8687 | 0.8339 | 0.8874 | 0.7959 | 0.7358 | 0.7500 | 0.7631 |
| 0.2309 | 10.6101 | 16000 | 0.1305 | 0.9184 | 0.9574 | 0.9574 | 0.9575 | 0.9381 | 0.9648 | 0.9692 | 0.9333 | 0.9074 | 0.8991 | 0.9337 |
| 0.0907 | 13.2626 | 20000 | 0.1249 | 0.9267 | 0.9620 | 0.9620 | 0.9636 | 0.9541 | 0.9594 | 0.9708 | 0.9379 | 0.9205 | 0.9169 | 0.9316 |
| 0.3051 | 15.9151 | 24000 | 0.0529 | 0.9675 | 0.9835 | 0.9835 | 0.9842 | 0.9805 | 0.9839 | 0.9854 | 0.9712 | 0.9636 | 0.9626 | 0.9728 |
| 0.0659 | 18.5676 | 28000 | 0.0630 | 0.9670 | 0.9832 | 0.9833 | 0.9852 | 0.9747 | 0.9885 | 0.9846 | 0.9719 | 0.9642 | 0.9633 | 0.9687 |
| 0.0474 | 21.2202 | 32000 | 0.0454 | 0.9768 | 0.9882 | 0.9883 | 0.9910 | 0.9856 | 0.9865 | 0.9899 | 0.9783 | 0.9737 | 0.9747 | 0.9805 |
| 0.0449 | 23.8727 | 36000 | 0.0468 | 0.9795 | 0.9896 | 0.9896 | 0.9900 | 0.9812 | 0.9900 | 0.9973 | 0.9828 | 0.9743 | 0.9783 | 0.9824 |
| 0.0552 | 26.5252 | 40000 | 0.0266 | 0.9884 | 0.9942 | 0.9942 | 0.9949 | 0.9917 | 0.9947 | 0.9953 | 0.9888 | 0.9865 | 0.9866 | 0.9916 |
| 0.0541 | 29.1777 | 44000 | 0.0290 | 0.9908 | 0.9954 | 0.9954 | 0.9951 | 0.9951 | 0.9967 | 0.9946 | 0.9921 | 0.9897 | 0.9905 | 0.9909 |
| 0.0082 | 31.8302 | 48000 | 0.0421 | 0.9891 | 0.9945 | 0.9945 | 0.9940 | 0.9924 | 0.9951 | 0.9966 | 0.9908 | 0.9869 | 0.9884 | 0.9904 |
| 0.0061 | 34.4828 | 52000 | 0.0345 | 0.9923 | 0.9961 | 0.9961 | 0.9971 | 0.9941 | 0.9966 | 0.9966 | 0.9939 | 0.9912 | 0.9916 | 0.9922 |
| 0.0053 | 37.1353 | 56000 | 0.0256 | 0.9941 | 0.9970 | 0.9970 | 0.9976 | 0.9972 | 0.9966 | 0.9968 | 0.9957 | 0.9928 | 0.9929 | 0.9949 |
| 0.0045 | 39.7878 | 60000 | 0.0256 | 0.9937 | 0.9968 | 0.9968 | 0.9978 | 0.9959 | 0.9959 | 0.9978 | 0.9937 | 0.9927 | 0.9926 | 0.9957 |
| 0.0046 | 42.4403 | 64000 | 0.0171 | 0.9964 | 0.9982 | 0.9982 | 0.9983 | 0.9976 | 0.9987 | 0.9981 | 0.9972 | 0.9958 | 0.9955 | 0.9969 |
| 0.0032 | 45.0928 | 68000 | 0.0293 | 0.9957 | 0.9979 | 0.9979 | 0.9983 | 0.9969 | 0.9975 | 0.9988 | 0.9966 | 0.9950 | 0.9950 | 0.9964 |
| 0.003 | 47.7454 | 72000 | 0.0251 | 0.9964 | 0.9982 | 0.9982 | 0.9984 | 0.9973 | 0.9984 | 0.9987 | 0.9973 | 0.9952 | 0.9965 | 0.9966 |
| 0.0035 | 50.3979 | 76000 | 0.0245 | 0.9973 | 0.9986 | 0.9986 | 0.9993 | 0.9982 | 0.9983 | 0.9987 | 0.9982 | 0.9969 | 0.9963 | 0.9977 |
| 0.0025 | 53.0504 | 80000 | 0.0222 | 0.9972 | 0.9986 | 0.9986 | 0.9990 | 0.9980 | 0.9987 | 0.9986 | 0.9985 | 0.9965 | 0.9970 | 0.9968 |
| 0.0023 | 55.7029 | 84000 | 0.0104 | 0.9982 | 0.9991 | 0.9991 | 0.9994 | 0.9989 | 0.9987 | 0.9993 | 0.9988 | 0.9980 | 0.9975 | 0.9983 |
| 0.0022 | 58.3554 | 88000 | 0.0117 | 0.9981 | 0.9990 | 0.9990 | 0.9995 | 0.9985 | 0.9988 | 0.9993 | 0.9991 | 0.9979 | 0.9976 | 0.9978 |
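The columns above can be reproduced with the `mean_iou` module of the `evaluate` library; a small sketch on random stand-in data (the `ignore_index` value here is an assumption, not taken from the training config):

```python
# Sketch: computing mean IoU / mean accuracy / overall accuracy with `evaluate`.
# Random arrays stand in for real predictions; ignore_index=255 is an assumption.
import evaluate
import numpy as np

mean_iou = evaluate.load("mean_iou")
pred = np.random.randint(0, 4, size=(512, 512))
label = np.random.randint(0, 4, size=(512, 512))
results = mean_iou.compute(
    predictions=[pred], references=[label], num_labels=4, ignore_index=255
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```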
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
Dnq2025/swin-large-mask2former-finetuned-ER-Mito-LD8 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-large-mask2former-finetuned-ER-Mito-LD8
This model is a fine-tuned version of [facebook/mask2former-swin-large-ade-semantic](https://huggingface.co/facebook/mask2former-swin-large-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
It achieves the following results on the evaluation set:
- Mean Iou: 0.5775
- Loss: 34.1770
## Model description
More information needed
## Intended uses & limitations
More information needed
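A minimal semantic-segmentation sketch for this checkpoint, assuming the repo id below and the standard Mask2Former post-processing in `transformers`:

```python
# Minimal semantic-segmentation sketch for the Mask2Former checkpoint.
from PIL import Image
import torch
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

repo = "Dnq2025/swin-large-mask2former-finetuned-ER-Mito-LD8"
processor = AutoImageProcessor.from_pretrained(repo)
model = Mask2FormerForUniversalSegmentation.from_pretrained(repo).eval()

image = Image.open("em_slice.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# (H, W) tensor of class ids: background / er / mito / ld
seg = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
```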
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 6450
### Training results
| Training Loss | Epoch | Step | Mean Iou | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 50.2838 | 1.0 | 129 | 0.3320 | 40.2146 |
| 38.6838 | 2.0 | 258 | 0.4777 | 33.3521 |
| 35.7266 | 3.0 | 387 | 0.5179 | 30.3853 |
| 28.8023 | 4.0 | 516 | 0.5172 | 31.1722 |
| 27.499 | 5.0 | 645 | 0.5405 | 29.2993 |
| 26.443 | 6.0 | 774 | 0.6063 | 27.7216 |
| 24.4565 | 7.0 | 903 | 0.5585 | 27.0999 |
| 23.7098 | 8.0 | 1032 | 0.5561 | 27.6352 |
| 22.5123 | 9.0 | 1161 | 0.5488 | 26.2882 |
| 21.6224 | 10.0 | 1290 | 0.5360 | 28.5500 |
| 21.1611 | 11.0 | 1419 | 0.5107 | 26.7347 |
| 20.0678 | 12.0 | 1548 | 0.5705 | 25.9258 |
| 19.8926 | 13.0 | 1677 | 0.5776 | 26.3917 |
| 18.5645 | 14.0 | 1806 | 0.5857 | 25.5181 |
| 18.504 | 15.0 | 1935 | 0.5701 | 26.0597 |
| 17.6968 | 16.0 | 2064 | 0.6108 | 25.1246 |
| 17.555 | 17.0 | 2193 | 0.6117 | 25.6074 |
| 17.2567 | 18.0 | 2322 | 0.5683 | 27.1555 |
| 16.0851 | 19.0 | 2451 | 0.6045 | 27.6046 |
| 16.308 | 20.0 | 2580 | 0.5550 | 28.1746 |
| 15.7719 | 21.0 | 2709 | 0.5898 | 25.3221 |
| 15.0966 | 22.0 | 2838 | 0.6299 | 27.0200 |
| 15.2529 | 23.0 | 2967 | 0.5870 | 29.0526 |
| 15.2963 | 24.0 | 3096 | 0.5638 | 27.0797 |
| 14.5228 | 25.0 | 3225 | 0.6203 | 27.8585 |
| 14.2121 | 26.0 | 3354 | 0.5659 | 28.6089 |
| 13.909 | 27.0 | 3483 | 0.6042 | 28.4436 |
| 14.0334 | 28.0 | 3612 | 0.6067 | 29.2367 |
| 13.3485 | 29.0 | 3741 | 0.5800 | 28.8674 |
| 13.4275 | 30.0 | 3870 | 0.6036 | 28.3902 |
| 13.1812 | 31.0 | 3999 | 0.5837 | 30.3429 |
| 13.0124 | 32.0 | 4128 | 0.5837 | 28.6284 |
| 12.4116 | 33.0 | 4257 | 0.5851 | 30.4995 |
| 13.3998 | 34.0 | 4386 | 0.5749 | 31.7624 |
| 12.794 | 35.0 | 4515 | 0.5840 | 29.0796 |
| 12.2829 | 36.0 | 4644 | 0.5594 | 30.8203 |
| 12.204 | 37.0 | 4773 | 0.6036 | 28.8408 |
| 12.6922 | 38.0 | 4902 | 0.5848 | 30.4332 |
| 12.1068 | 39.0 | 5031 | 0.5988 | 29.9606 |
| 11.7072 | 40.0 | 5160 | 0.5681 | 31.9938 |
| 11.7888 | 41.0 | 5289 | 0.5834 | 30.7203 |
| 11.5609 | 42.0 | 5418 | 0.5843 | 30.2104 |
| 11.4152 | 43.0 | 5547 | 0.6122 | 31.5076 |
| 11.932 | 44.0 | 5676 | 0.6020 | 31.8253 |
| 11.3475 | 45.0 | 5805 | 0.5828 | 32.4827 |
| 10.6893 | 46.0 | 5934 | 0.5792 | 33.6502 |
| 11.7356 | 47.0 | 6063 | 0.5777 | 33.7372 |
| 10.8846 | 48.0 | 6192 | 0.5716 | 34.2067 |
| 11.5715 | 49.0 | 6321 | 0.5709 | 34.1338 |
| 11.0337 | 50.0 | 6450 | 0.5775 | 34.0800 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
"er",
"mito",
"ld"
] |
nagarajuthirupathi/indoor_window_detection |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# output1
This model is a fine-tuned version of [nagarajuthirupathi/window_detection_model](https://huggingface.co/nagarajuthirupathi/window_detection_model) on the nagarajuthirupathi/indoor_window_detection_swf dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 9.971930719778446e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50.0
- mixed_precision_training: Native AMP
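The effective batch size above comes from gradient accumulation (4 per device × 2 accumulation steps = 8), and Native AMP corresponds to `fp16=True`; a rough `TrainingArguments` sketch of this setup (illustrative only):

```python
# Sketch: gradient accumulation gives total_train_batch_size = 4 * 2 = 8;
# fp16=True corresponds to Native AMP mixed-precision training.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="output1",
    learning_rate=9.971930719778446e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,
    lr_scheduler_type="constant",
    num_train_epochs=50,
    seed=42,
    fp16=True,
)
```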
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"floor",
"door",
"blinds",
"curtain",
"windowpane"
] |
nagarajuthirupathi/swf_trained_model |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# output1
This model is a fine-tuned version of [mukesh3444/window_detection_model](https://huggingface.co/mukesh3444/window_detection_model) on the nagarajuthirupathi/indoor_window_detection_swf dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 9.971930719778446e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
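The checkpoint keeps the full ADE20K-style label set (listed below); rather than hard-coding it, the class names can be read from the checkpoint config, assuming label ids follow the order of the list:

```python
# Sketch: reading the class list from the checkpoint config.
# id2label is standard in transformers configs; the id order is assumed
# to follow the label list below.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("nagarajuthirupathi/swf_trained_model")
print(config.id2label[8])    # e.g. "window " (note the trailing space in this label set)
print(len(config.id2label))  # number of classes
```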
| [
"wall",
"building",
"sky",
"floor",
"tree",
"ceiling",
"road, route",
"bed",
"window ",
"grass",
"cabinet",
"sidewalk, pavement",
"person",
"earth, ground",
"door",
"table",
"mountain, mount",
"plant",
"curtain",
"chair",
"car",
"water",
"painting, picture",
"sofa",
"shelf",
"house",
"sea",
"mirror",
"rug",
"field",
"armchair",
"seat",
"fence",
"desk",
"rock, stone",
"wardrobe, closet, press",
"lamp",
"tub",
"rail",
"cushion",
"base, pedestal, stand",
"box",
"column, pillar",
"signboard, sign",
"chest of drawers, chest, bureau, dresser",
"counter",
"sand",
"sink",
"skyscraper",
"fireplace",
"refrigerator, icebox",
"grandstand, covered stand",
"path",
"stairs",
"runway",
"case, display case, showcase, vitrine",
"pool table, billiard table, snooker table",
"pillow",
"screen door, screen",
"stairway, staircase",
"river",
"bridge, span",
"bookcase",
"blind, screen",
"coffee table",
"toilet, can, commode, crapper, pot, potty, stool, throne",
"flower",
"book",
"hill",
"bench",
"countertop",
"stove",
"palm, palm tree",
"kitchen island",
"computer",
"swivel chair",
"boat",
"bar",
"arcade machine",
"hovel, hut, hutch, shack, shanty",
"bus",
"towel",
"light",
"truck",
"tower",
"chandelier",
"awning, sunshade, sunblind",
"street lamp",
"booth",
"tv",
"plane",
"dirt track",
"clothes",
"pole",
"land, ground, soil",
"bannister, banister, balustrade, balusters, handrail",
"escalator, moving staircase, moving stairway",
"ottoman, pouf, pouffe, puff, hassock",
"bottle",
"buffet, counter, sideboard",
"poster, posting, placard, notice, bill, card",
"stage",
"van",
"ship",
"fountain",
"conveyer belt, conveyor belt, conveyer, conveyor, transporter",
"canopy",
"washer, automatic washer, washing machine",
"plaything, toy",
"pool",
"stool",
"barrel, cask",
"basket, handbasket",
"falls",
"tent",
"bag",
"minibike, motorbike",
"cradle",
"oven",
"ball",
"food, solid food",
"step, stair",
"tank, storage tank",
"trade name",
"microwave",
"pot",
"animal",
"bicycle",
"lake",
"dishwasher",
"screen",
"blanket, cover",
"sculpture",
"hood, exhaust hood",
"sconce",
"vase",
"traffic light",
"tray",
"trash can",
"fan",
"pier",
"crt screen",
"plate",
"monitor",
"bulletin board",
"shower",
"radiator",
"glass, drinking glass",
"clock",
"flag"
] |
TommyClas/50d_seg_model_20250507 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# 50d_seg_model_20250507
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the TommyClas/50d_seg_20250505 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8496
- Mean Iou: 0.4618
- Mean Accuracy: 0.6974
- Overall Accuracy: 0.7722
- Accuracy 背景: nan
- Accuracy 孔隙: 0.7921
- Accuracy Ld c-s-h: 0.7918
- Accuracy Hd c-s-h: 0.3186
- Accuracy 未水化水泥颗粒: 0.8868
- Iou 背景: 0.0
- Iou 孔隙: 0.6924
- Iou Ld c-s-h: 0.5935
- Iou Hd c-s-h: 0.2372
- Iou 未水化水泥颗粒: 0.7862
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 10000
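Training here is step-based (10,000 steps) with a polynomial decay schedule; a minimal sketch using the `transformers` scheduler helper, with a stand-in module and an assumed warmup of 0 steps:

```python
# Sketch: step-based training with polynomial LR decay.
# The warmup step count is an assumption, not taken from the card.
import torch
from transformers import get_polynomial_decay_schedule_with_warmup

model = torch.nn.Linear(4, 4)  # stand-in for the SegFormer model
optimizer = torch.optim.AdamW(model.parameters(), lr=0.01, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_polynomial_decay_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=10_000, power=1.0
)
for _ in range(10_000):
    ...  # forward/backward and optimizer.step() would go here
    scheduler.step()
```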
### Training results
| Training Loss | Epoch | Step | Accuracy Hd c-s-h | Accuracy Ld c-s-h | Accuracy 孔隙 | Accuracy 未水化水泥颗粒 | Accuracy 背景 | Iou Hd c-s-h | Iou Ld c-s-h | Iou 孔隙 | Iou 未水化水泥颗粒 | Iou 背景 | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
|:-------------:|:-----:|:-----:|:-----------------:|:-----------------:|:-----------:|:----------------:|:-----------:|:------------:|:------------:|:------:|:-----------:|:------:|:---------------:|:-------------:|:--------:|:----------------:|
| 1.0019 | 1.0 | 250 | 0.0 | 0.0650 | 0.7441 | 0.4815 | nan | 0.0 | 0.0544 | 0.3893 | 0.4701 | 0.0 | 4.4151 | 0.3226 | 0.1828 | 0.3786 |
| 0.4554 | 2.0 | 500 | 0.0000 | 0.6842 | 0.7487 | 0.8724 | nan | 0.0000 | 0.5197 | 0.5813 | 0.7626 | 0.0 | 1.2340 | 0.5763 | 0.3727 | 0.6884 |
| 0.4529 | 3.0 | 750 | 0.0000 | 0.7384 | 0.7863 | 0.8553 | nan | 0.0000 | 0.5339 | 0.6291 | 0.7592 | 0.0 | 0.9681 | 0.5950 | 0.3844 | 0.7172 |
| 0.4248 | 4.0 | 1000 | 0.2327 | 0.5245 | 0.8565 | 0.8152 | nan | 0.1599 | 0.4278 | 0.5931 | 0.7471 | 0.0 | 1.0964 | 0.6072 | 0.3856 | 0.6744 |
| 0.4194 | 5.0 | 1250 | 0.2258 | 0.7465 | 0.4263 | 0.9194 | nan | 0.1022 | 0.4743 | 0.4199 | 0.7475 | 0.0 | 0.9903 | 0.5795 | 0.3488 | 0.6308 |
| 0.4183 | 6.0 | 1500 | 0.5113 | 0.4379 | 0.9017 | 0.7679 | nan | 0.1996 | 0.3818 | 0.6551 | 0.7309 | 0.0 | 0.9263 | 0.6547 | 0.3935 | 0.6717 |
| 0.4178 | 7.0 | 1750 | 0.2577 | 0.5691 | 0.9078 | 0.8036 | nan | 0.1468 | 0.4584 | 0.6702 | 0.7487 | 0.0 | 0.8280 | 0.6346 | 0.4048 | 0.7077 |
| 0.4144 | 8.0 | 2000 | 0.1812 | 0.6915 | 0.6035 | 0.9596 | nan | 0.1060 | 0.4844 | 0.5706 | 0.6832 | 0.0 | 0.9423 | 0.6090 | 0.3688 | 0.6757 |
| 0.4148 | 9.0 | 2250 | 0.3482 | 0.1554 | 0.9229 | 0.5755 | nan | 0.1635 | 0.1371 | 0.4818 | 0.5680 | 0.0 | 1.7458 | 0.5005 | 0.2701 | 0.5215 |
| 0.4051 | 10.0 | 2500 | 0.0512 | 0.7194 | 0.8609 | 0.7365 | nan | 0.0456 | 0.5033 | 0.6617 | 0.6965 | 0.0 | 0.8936 | 0.5920 | 0.3814 | 0.7146 |
| 0.4027 | 11.0 | 2750 | 0.1803 | 0.5811 | 0.9251 | 0.8497 | nan | 0.1368 | 0.4711 | 0.6614 | 0.7522 | 0.0 | 0.8490 | 0.6340 | 0.4043 | 0.7212 |
| 0.4 | 12.0 | 3000 | 0.0871 | 0.7090 | 0.8824 | 0.8546 | nan | 0.0746 | 0.5374 | 0.6926 | 0.7589 | 0.0 | 0.7884 | 0.6333 | 0.4127 | 0.7463 |
| 0.3966 | 13.0 | 3250 | 0.3821 | 0.7329 | 0.8081 | 0.7223 | nan | 0.2117 | 0.5417 | 0.6819 | 0.6861 | 0.0 | 0.7998 | 0.6614 | 0.4243 | 0.7265 |
| 0.3809 | 14.0 | 3500 | 0.1311 | 0.6534 | 0.9270 | 0.8173 | nan | 0.1055 | 0.5109 | 0.6812 | 0.7604 | 0.0 | 0.8045 | 0.6322 | 0.4116 | 0.7370 |
| 0.3769 | 15.0 | 3750 | 0.2447 | 0.7358 | 0.3770 | 0.9680 | nan | 0.1478 | 0.4557 | 0.3679 | 0.6794 | 0.0 | 1.0931 | 0.5814 | 0.3301 | 0.6222 |
| 0.3742 | 16.0 | 4000 | 0.0718 | 0.6090 | 0.9529 | 0.8171 | nan | 0.0647 | 0.4802 | 0.6614 | 0.7576 | 0.0 | 0.8481 | 0.6127 | 0.3928 | 0.7246 |
| 0.3723 | 17.0 | 4250 | 0.1708 | 0.7345 | 0.8772 | 0.8962 | nan | 0.1393 | 0.5764 | 0.7142 | 0.7782 | 0.0 | 0.6791 | 0.6697 | 0.4416 | 0.7698 |
| 0.369 | 18.0 | 4500 | 0.0727 | 0.7378 | 0.9273 | 0.7286 | nan | 0.0679 | 0.5349 | 0.7077 | 0.7059 | 0.0 | 0.7373 | 0.6166 | 0.4033 | 0.7440 |
| 0.3812 | 19.0 | 4750 | 0.1705 | 0.8129 | 0.8411 | 0.8514 | nan | 0.1543 | 0.5980 | 0.7109 | 0.7786 | 0.0 | 0.6597 | 0.6690 | 0.4484 | 0.7764 |
| 0.3686 | 20.0 | 5000 | 0.1371 | 0.7812 | 0.8773 | 0.8351 | nan | 0.1248 | 0.5836 | 0.7143 | 0.7709 | 0.0 | 0.7417 | 0.6577 | 0.4387 | 0.7709 |
| 0.3675 | 21.0 | 5250 | 0.2222 | 0.7268 | 0.9023 | 0.8715 | nan | 0.1882 | 0.5762 | 0.7135 | 0.7853 | 0.0 | 0.7173 | 0.6807 | 0.4526 | 0.7746 |
| 0.3647 | 22.0 | 5500 | 0.2299 | 0.7688 | 0.8603 | 0.8955 | nan | 0.1904 | 0.5974 | 0.7207 | 0.7851 | 0.0 | 0.7049 | 0.6886 | 0.4587 | 0.7813 |
| 0.3629 | 23.0 | 5750 | 0.2192 | 0.6911 | 0.9342 | 0.8092 | nan | 0.1833 | 0.5437 | 0.7023 | 0.7632 | 0.0 | 0.8170 | 0.6634 | 0.4385 | 0.7589 |
| 0.3623 | 24.0 | 6000 | 0.2782 | 0.7554 | 0.8817 | 0.8655 | nan | 0.2243 | 0.5923 | 0.7212 | 0.7869 | 0.0 | 0.7217 | 0.6952 | 0.4649 | 0.7814 |
| 0.3609 | 25.0 | 6250 | 0.2384 | 0.7263 | 0.9150 | 0.8090 | nan | 0.1964 | 0.5627 | 0.7133 | 0.7635 | 0.0 | 0.8099 | 0.6722 | 0.4472 | 0.7667 |
| 0.3619 | 26.0 | 6500 | 0.2477 | 0.8017 | 0.8433 | 0.8570 | nan | 0.2032 | 0.6010 | 0.7189 | 0.7839 | 0.0 | 0.7394 | 0.6874 | 0.4614 | 0.7808 |
| 0.3603 | 27.0 | 6750 | 0.2353 | 0.8035 | 0.8323 | 0.8729 | nan | 0.1923 | 0.6014 | 0.7161 | 0.7879 | 0.0 | 0.7378 | 0.6860 | 0.4595 | 0.7801 |
| 0.3591 | 28.0 | 7000 | 0.2726 | 0.7587 | 0.8749 | 0.8612 | nan | 0.2141 | 0.5885 | 0.7218 | 0.7867 | 0.0 | 0.7468 | 0.6918 | 0.4622 | 0.7789 |
| 0.3584 | 29.0 | 7250 | 0.7495 | 0.4603 | 0.6958 | 0.7760 | nan | 0.7940 | 0.7926 | 0.2814 | 0.9150 | 0.0 | 0.7036 | 0.5987 | 0.2133 | 0.7861 |
| 0.3564 | 30.0 | 7500 | 0.7896 | 0.4520 | 0.6803 | 0.7733 | nan | 0.7788 | 0.8223 | 0.2201 | 0.9001 | 0.0 | 0.6955 | 0.6001 | 0.1781 | 0.7865 |
| 0.3567 | 31.0 | 7750 | 0.7667 | 0.4590 | 0.6863 | 0.7810 | nan | 0.8533 | 0.7825 | 0.2269 | 0.8827 | 0.0 | 0.7195 | 0.5977 | 0.1887 | 0.7891 |
| 0.3551 | 32.0 | 8000 | 0.7786 | 0.4616 | 0.6945 | 0.7776 | nan | 0.7922 | 0.8068 | 0.2755 | 0.9036 | 0.0 | 0.7005 | 0.6029 | 0.2179 | 0.7866 |
| 0.3539 | 33.0 | 8250 | 0.7576 | 0.4644 | 0.6960 | 0.7801 | nan | 0.8306 | 0.7869 | 0.2804 | 0.8862 | 0.0 | 0.7122 | 0.5988 | 0.2222 | 0.7889 |
| 0.354 | 34.0 | 8500 | 0.7926 | 0.4632 | 0.6966 | 0.7774 | nan | 0.7967 | 0.8027 | 0.2899 | 0.8969 | 0.0 | 0.7016 | 0.6007 | 0.2255 | 0.7881 |
| 0.3519 | 35.0 | 8750 | 0.7779 | 0.4626 | 0.6919 | 0.7771 | nan | 0.8519 | 0.7782 | 0.2869 | 0.8505 | 0.0 | 0.7137 | 0.5914 | 0.2252 | 0.7825 |
| 0.3511 | 36.0 | 9000 | 0.8104 | 0.4638 | 0.6987 | 0.7753 | nan | 0.7967 | 0.7967 | 0.3138 | 0.8878 | 0.0 | 0.6982 | 0.5980 | 0.2358 | 0.7873 |
| 0.3505 | 37.0 | 9250 | 0.8228 | 0.4613 | 0.6954 | 0.7735 | nan | 0.7830 | 0.8048 | 0.3016 | 0.8922 | 0.0 | 0.6916 | 0.5975 | 0.2307 | 0.7868 |
| 0.3499 | 38.0 | 9500 | 0.8103 | 0.4617 | 0.6939 | 0.7752 | nan | 0.8289 | 0.7767 | 0.2915 | 0.8786 | 0.0 | 0.7023 | 0.5905 | 0.2290 | 0.7869 |
| 0.3493 | 39.0 | 9750 | 0.8413 | 0.4616 | 0.6951 | 0.7733 | nan | 0.8075 | 0.7885 | 0.3065 | 0.8778 | 0.0 | 0.6970 | 0.5925 | 0.2325 | 0.7863 |
| 0.3481 | 40.0 | 10000 | 0.8496 | 0.4618 | 0.6974 | 0.7722 | nan | 0.7921 | 0.7918 | 0.3186 | 0.8868 | 0.0 | 0.6924 | 0.5935 | 0.2372 | 0.7862 |
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"背景",
"孔隙",
"ld c-s-h",
"hd c-s-h",
"未水化水泥颗粒"
] |
nagarajuthirupathi/test_model_nagaraju |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# output1
This model is a fine-tuned version of [mukesh3444/window_detection_model](https://huggingface.co/mukesh3444/window_detection_model) on the nagarajuthirupathi/indoor_window_detection_swf dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 9.971930719778446e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50.0
- mixed_precision_training: Native AMP
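For orientation, these settings map onto 🤗 `TrainingArguments` roughly as below; this is a sketch of the configuration, not the exact training script, and `output_dir` is a placeholder.
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="output1",                  # placeholder
    learning_rate=9.971930719778446e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,         # effective train batch size of 8
    lr_scheduler_type="constant",
    num_train_epochs=50.0,
    seed=42,
    fp16=True,                             # "Native AMP" mixed precision
)
```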
### Training results
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
| [
"wall",
"building",
"sky",
"floor",
"tree",
"ceiling",
"road, route",
"bed",
"window ",
"grass",
"cabinet",
"sidewalk, pavement",
"person",
"earth, ground",
"door",
"table",
"mountain, mount",
"plant",
"curtain",
"chair",
"car",
"water",
"painting, picture",
"sofa",
"shelf",
"house",
"sea",
"mirror",
"rug",
"field",
"armchair",
"seat",
"fence",
"desk",
"rock, stone",
"wardrobe, closet, press",
"lamp",
"tub",
"rail",
"cushion",
"base, pedestal, stand",
"box",
"column, pillar",
"signboard, sign",
"chest of drawers, chest, bureau, dresser",
"counter",
"sand",
"sink",
"skyscraper",
"fireplace",
"refrigerator, icebox",
"grandstand, covered stand",
"path",
"stairs",
"runway",
"case, display case, showcase, vitrine",
"pool table, billiard table, snooker table",
"pillow",
"screen door, screen",
"stairway, staircase",
"river",
"bridge, span",
"bookcase",
"blind, screen",
"coffee table",
"toilet, can, commode, crapper, pot, potty, stool, throne",
"flower",
"book",
"hill",
"bench",
"countertop",
"stove",
"palm, palm tree",
"kitchen island",
"computer",
"swivel chair",
"boat",
"bar",
"arcade machine",
"hovel, hut, hutch, shack, shanty",
"bus",
"towel",
"light",
"truck",
"tower",
"chandelier",
"awning, sunshade, sunblind",
"street lamp",
"booth",
"tv",
"plane",
"dirt track",
"clothes",
"pole",
"land, ground, soil",
"bannister, banister, balustrade, balusters, handrail",
"escalator, moving staircase, moving stairway",
"ottoman, pouf, pouffe, puff, hassock",
"bottle",
"buffet, counter, sideboard",
"poster, posting, placard, notice, bill, card",
"stage",
"van",
"ship",
"fountain",
"conveyer belt, conveyor belt, conveyer, conveyor, transporter",
"canopy",
"washer, automatic washer, washing machine",
"plaything, toy",
"pool",
"stool",
"barrel, cask",
"basket, handbasket",
"falls",
"tent",
"bag",
"minibike, motorbike",
"cradle",
"oven",
"ball",
"food, solid food",
"step, stair",
"tank, storage tank",
"trade name",
"microwave",
"pot",
"animal",
"bicycle",
"lake",
"dishwasher",
"screen",
"blanket, cover",
"sculpture",
"hood, exhaust hood",
"sconce",
"vase",
"traffic light",
"tray",
"trash can",
"fan",
"pier",
"crt screen",
"plate",
"monitor",
"bulletin board",
"shower",
"radiator",
"glass, drinking glass",
"clock",
"flag"
] |
Dnq2025/swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue
This model is a fine-tuned version of [Dnq2025/swin-large-mask2former-finetuned-ER-Mito-LD8](https://huggingface.co/Dnq2025/swin-large-mask2former-finetuned-ER-Mito-LD8) on the Dnq2025/Mask2former_Finetune dataset.
It achieves the following results on the evaluation set:
- Mean Iou: 0.4849
- Loss: 84.8457
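A minimal inference sketch using the standard Mask2Former classes from 🤗 Transformers, assuming the repository ships an image-processor config; the image path is a placeholder.
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

repo = "Dnq2025/swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue"
processor = AutoImageProcessor.from_pretrained(repo)
model = Mask2FormerForUniversalSegmentation.from_pretrained(repo)

image = Image.open("em_slice.png").convert("RGB")  # placeholder EM image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Fuse the class and mask queries into a per-pixel label map at input size.
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]  # (H, W) tensor of class ids; see the label list below
```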
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 3225
### Training results
| Training Loss | Epoch | Step | Mean Iou | Validation Loss |
|:-------------:|:--------:|:----:|:--------:|:---------------:|
| No log | 16.6667 | 100 | 0.4737 | 34.6305 |
| 20.404 | 33.3333 | 200 | 0.4715 | 42.0013 |
| 10.0702 | 50.0 | 300 | 0.4774 | 46.1805 |
| 10.0702 | 66.6667 | 400 | 0.4821 | 50.9460 |
| 9.0859 | 83.3333 | 500 | 0.4920 | 48.5783 |
| 8.6094 | 100.0 | 600 | 0.4749 | 52.5121 |
| 8.6094 | 116.6667 | 700 | 0.4865 | 49.8120 |
| 8.4671 | 133.3333 | 800 | 0.4812 | 56.1730 |
| 8.3066 | 150.0 | 900 | 0.4934 | 60.8687 |
| 8.3066 | 166.6667 | 1000 | 0.4916 | 59.8411 |
| 8.1972 | 183.3333 | 1100 | 0.4916 | 63.6835 |
| 8.0926 | 200.0 | 1200 | 0.4905 | 60.2863 |
| 8.0926 | 216.6667 | 1300 | 0.4886 | 63.5495 |
| 8.0059 | 233.3333 | 1400 | 0.4924 | 65.0410 |
| 7.974 | 250.0 | 1500 | 0.4900 | 66.5468 |
| 7.974 | 266.6667 | 1600 | 0.4902 | 67.8736 |
| 8.0177 | 283.3333 | 1700 | 0.4851 | 68.4019 |
| 7.9356 | 300.0 | 1800 | 0.4894 | 74.3255 |
| 7.9356 | 316.6667 | 1900 | 0.4900 | 73.2780 |
| 7.8784 | 333.3333 | 2000 | 0.4899 | 75.2934 |
| 7.8445 | 350.0 | 2100 | 0.4878 | 77.9780 |
| 7.8445 | 366.6667 | 2200 | 0.4868 | 76.0033 |
| 7.7889 | 383.3333 | 2300 | 0.4879 | 78.4704 |
| 7.7828 | 400.0 | 2400 | 0.4860 | 77.1425 |
| 7.7828 | 416.6667 | 2500 | 0.4915 | 78.8530 |
| 7.7601 | 433.3333 | 2600 | 0.4875 | 81.9598 |
| 7.738 | 450.0 | 2700 | 0.4865 | 79.7205 |
| 7.738 | 466.6667 | 2800 | 0.4866 | 79.9854 |
| 7.7139 | 483.3333 | 2900 | 0.4869 | 82.6984 |
| 7.7043 | 500.0 | 3000 | 0.4847 | 81.8537 |
| 7.7043 | 516.6667 | 3100 | 0.4855 | 81.6806 |
| 7.6982 | 533.3333 | 3200 | 0.4850 | 85.2813 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"background",
" innertongue",
" mitochondria",
" mitolikeorganelle"
] |
PushkarA07/segformer-b0-finetuned-batch3-19May |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-batch3-19May
This model is a fine-tuned version of [PushkarA07/segformer-b0-finetuned-batch2w5-15Dec](https://huggingface.co/PushkarA07/segformer-b0-finetuned-batch2w5-15Dec) on the PushkarA07/batch3-tiles_first dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0057
- Mean Iou: 0.7819
- Mean Accuracy: 0.8573
- Overall Accuracy: 0.9984
- Accuracy Abnormality: 0.7153
- Iou Abnormality: 0.5654
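The metrics above follow the usual semantic-segmentation definitions; as a sketch, they can be recomputed with the 🤗 `evaluate` library's `mean_iou` metric (the masks below are dummies, not data from this run).
```python
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# Dummy binary masks standing in for (prediction, annotation) pairs.
pred = np.array([[0, 0, 1, 1]] * 4)
ref = np.array([[0, 1, 1, 1]] * 4)

results = mean_iou.compute(
    predictions=[pred], references=[ref],
    num_labels=2, ignore_index=255, reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```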
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:---------------:|
| 0.005 | 0.9091 | 10 | 0.0059 | 0.7411 | 0.8270 | 0.9980 | 0.6550 | 0.4843 |
| 0.0038 | 1.8182 | 20 | 0.0061 | 0.7456 | 0.8639 | 0.9978 | 0.7293 | 0.4935 |
| 0.004 | 2.7273 | 30 | 0.0053 | 0.7557 | 0.8153 | 0.9983 | 0.6312 | 0.5132 |
| 0.0041 | 3.6364 | 40 | 0.0053 | 0.7559 | 0.8268 | 0.9982 | 0.6544 | 0.5137 |
| 0.0051 | 4.5455 | 50 | 0.0051 | 0.7659 | 0.8514 | 0.9982 | 0.7037 | 0.5335 |
| 0.0022 | 5.4545 | 60 | 0.0049 | 0.7669 | 0.8381 | 0.9983 | 0.6770 | 0.5356 |
| 0.0023 | 6.3636 | 70 | 0.0052 | 0.7667 | 0.8571 | 0.9982 | 0.7153 | 0.5353 |
| 0.0016 | 7.2727 | 80 | 0.0052 | 0.7679 | 0.8263 | 0.9984 | 0.6533 | 0.5375 |
| 0.0025 | 8.1818 | 90 | 0.0054 | 0.7687 | 0.8682 | 0.9982 | 0.7374 | 0.5392 |
| 0.0045 | 9.0909 | 100 | 0.0060 | 0.7622 | 0.8849 | 0.9980 | 0.7712 | 0.5264 |
| 0.0048 | 10.0 | 110 | 0.0054 | 0.7609 | 0.8045 | 0.9984 | 0.6094 | 0.5235 |
| 0.0035 | 10.9091 | 120 | 0.0054 | 0.7626 | 0.8587 | 0.9981 | 0.7185 | 0.5271 |
| 0.0033 | 11.8182 | 130 | 0.0051 | 0.7700 | 0.8375 | 0.9983 | 0.6758 | 0.5417 |
| 0.0022 | 12.7273 | 140 | 0.0052 | 0.7691 | 0.8476 | 0.9983 | 0.6961 | 0.5400 |
| 0.0039 | 13.6364 | 150 | 0.0055 | 0.7639 | 0.8710 | 0.9981 | 0.7432 | 0.5296 |
| 0.0034 | 14.5455 | 160 | 0.0051 | 0.7740 | 0.8457 | 0.9984 | 0.6922 | 0.5497 |
| 0.0039 | 15.4545 | 170 | 0.0053 | 0.7718 | 0.8653 | 0.9982 | 0.7316 | 0.5453 |
| 0.0024 | 16.3636 | 180 | 0.0053 | 0.7733 | 0.8620 | 0.9983 | 0.7248 | 0.5484 |
| 0.0023 | 17.2727 | 190 | 0.0051 | 0.7756 | 0.8424 | 0.9984 | 0.6855 | 0.5527 |
| 0.0022 | 18.1818 | 200 | 0.0053 | 0.7745 | 0.8748 | 0.9982 | 0.7507 | 0.5507 |
| 0.0041 | 19.0909 | 210 | 0.0051 | 0.7755 | 0.8470 | 0.9984 | 0.6947 | 0.5527 |
| 0.0025 | 20.0 | 220 | 0.0052 | 0.7791 | 0.8638 | 0.9983 | 0.7284 | 0.5599 |
| 0.0029 | 20.9091 | 230 | 0.0052 | 0.7760 | 0.8554 | 0.9983 | 0.7117 | 0.5537 |
| 0.0007 | 21.8182 | 240 | 0.0052 | 0.7760 | 0.8506 | 0.9984 | 0.7020 | 0.5537 |
| 0.0026 | 22.7273 | 250 | 0.0052 | 0.7782 | 0.8627 | 0.9983 | 0.7263 | 0.5581 |
| 0.0026 | 23.6364 | 260 | 0.0051 | 0.7791 | 0.8471 | 0.9984 | 0.6950 | 0.5597 |
| 0.0015 | 24.5455 | 270 | 0.0053 | 0.7754 | 0.8665 | 0.9983 | 0.7339 | 0.5525 |
| 0.0044 | 25.4545 | 280 | 0.0053 | 0.7773 | 0.8562 | 0.9983 | 0.7133 | 0.5563 |
| 0.0023 | 26.3636 | 290 | 0.0053 | 0.7790 | 0.8606 | 0.9984 | 0.7220 | 0.5597 |
| 0.0021 | 27.2727 | 300 | 0.0055 | 0.7769 | 0.8735 | 0.9983 | 0.7480 | 0.5556 |
| 0.0024 | 28.1818 | 310 | 0.0051 | 0.7809 | 0.8514 | 0.9984 | 0.7035 | 0.5633 |
| 0.0017 | 29.0909 | 320 | 0.0055 | 0.7786 | 0.8643 | 0.9983 | 0.7294 | 0.5589 |
| 0.003 | 30.0 | 330 | 0.0053 | 0.7777 | 0.8528 | 0.9984 | 0.7063 | 0.5570 |
| 0.0012 | 30.9091 | 340 | 0.0052 | 0.7760 | 0.8360 | 0.9984 | 0.6727 | 0.5535 |
| 0.003 | 31.8182 | 350 | 0.0054 | 0.7813 | 0.8648 | 0.9984 | 0.7305 | 0.5642 |
| 0.0043 | 32.7273 | 360 | 0.0052 | 0.7813 | 0.8465 | 0.9984 | 0.6937 | 0.5642 |
| 0.0032 | 33.6364 | 370 | 0.0052 | 0.7835 | 0.8587 | 0.9984 | 0.7181 | 0.5685 |
| 0.0025 | 34.5455 | 380 | 0.0053 | 0.7837 | 0.8637 | 0.9984 | 0.7281 | 0.5691 |
| 0.0017 | 35.4545 | 390 | 0.0051 | 0.7825 | 0.8380 | 0.9985 | 0.6766 | 0.5666 |
| 0.0024 | 36.3636 | 400 | 0.0055 | 0.7788 | 0.8704 | 0.9983 | 0.7418 | 0.5594 |
| 0.0026 | 37.2727 | 410 | 0.0053 | 0.7813 | 0.8533 | 0.9984 | 0.7073 | 0.5641 |
| 0.0023 | 38.1818 | 420 | 0.0051 | 0.7845 | 0.8646 | 0.9984 | 0.7300 | 0.5706 |
| 0.0023 | 39.0909 | 430 | 0.0053 | 0.7805 | 0.8648 | 0.9984 | 0.7304 | 0.5627 |
| 0.0021 | 40.0 | 440 | 0.0053 | 0.7823 | 0.8541 | 0.9984 | 0.7089 | 0.5662 |
| 0.0027 | 40.9091 | 450 | 0.0053 | 0.7834 | 0.8583 | 0.9984 | 0.7174 | 0.5683 |
| 0.004 | 41.8182 | 460 | 0.0052 | 0.7854 | 0.8545 | 0.9985 | 0.7096 | 0.5724 |
| 0.003 | 42.7273 | 470 | 0.0053 | 0.7826 | 0.8466 | 0.9985 | 0.6939 | 0.5668 |
| 0.0035 | 43.6364 | 480 | 0.0054 | 0.7815 | 0.8637 | 0.9984 | 0.7282 | 0.5646 |
| 0.0033 | 44.5455 | 490 | 0.0053 | 0.7802 | 0.8560 | 0.9984 | 0.7127 | 0.5620 |
| 0.0027 | 45.4545 | 500 | 0.0051 | 0.7828 | 0.8489 | 0.9985 | 0.6985 | 0.5672 |
| 0.0032 | 46.3636 | 510 | 0.0053 | 0.7836 | 0.8605 | 0.9984 | 0.7218 | 0.5687 |
| 0.0034 | 47.2727 | 520 | 0.0054 | 0.7830 | 0.8521 | 0.9984 | 0.7049 | 0.5675 |
| 0.0017 | 48.1818 | 530 | 0.0054 | 0.7833 | 0.8595 | 0.9984 | 0.7198 | 0.5681 |
| 0.003 | 49.0909 | 540 | 0.0054 | 0.7809 | 0.8509 | 0.9984 | 0.7024 | 0.5633 |
| 0.0013 | 50.0 | 550 | 0.0053 | 0.7841 | 0.8533 | 0.9985 | 0.7073 | 0.5697 |
| 0.0026 | 50.9091 | 560 | 0.0054 | 0.7828 | 0.8589 | 0.9984 | 0.7186 | 0.5671 |
| 0.0013 | 51.8182 | 570 | 0.0054 | 0.7831 | 0.8552 | 0.9984 | 0.7111 | 0.5677 |
| 0.0019 | 52.7273 | 580 | 0.0055 | 0.7808 | 0.8645 | 0.9984 | 0.7298 | 0.5632 |
| 0.0024 | 53.6364 | 590 | 0.0054 | 0.7828 | 0.8550 | 0.9984 | 0.7107 | 0.5671 |
| 0.0024 | 54.5455 | 600 | 0.0054 | 0.7837 | 0.8593 | 0.9984 | 0.7193 | 0.5690 |
| 0.0025 | 55.4545 | 610 | 0.0055 | 0.7818 | 0.8566 | 0.9984 | 0.7140 | 0.5653 |
| 0.0018 | 56.3636 | 620 | 0.0054 | 0.7846 | 0.8509 | 0.9985 | 0.7025 | 0.5707 |
| 0.0027 | 57.2727 | 630 | 0.0054 | 0.7830 | 0.8571 | 0.9984 | 0.7149 | 0.5675 |
| 0.0017 | 58.1818 | 640 | 0.0054 | 0.7833 | 0.8575 | 0.9984 | 0.7158 | 0.5682 |
| 0.0038 | 59.0909 | 650 | 0.0054 | 0.7855 | 0.8585 | 0.9984 | 0.7178 | 0.5725 |
| 0.0018 | 60.0 | 660 | 0.0058 | 0.7780 | 0.8628 | 0.9983 | 0.7266 | 0.5576 |
| 0.0023 | 60.9091 | 670 | 0.0056 | 0.7809 | 0.8534 | 0.9984 | 0.7075 | 0.5634 |
| 0.002 | 61.8182 | 680 | 0.0055 | 0.7841 | 0.8549 | 0.9984 | 0.7105 | 0.5698 |
| 0.0038 | 62.7273 | 690 | 0.0055 | 0.7822 | 0.8562 | 0.9984 | 0.7132 | 0.5661 |
| 0.0024 | 63.6364 | 700 | 0.0055 | 0.7813 | 0.8556 | 0.9984 | 0.7120 | 0.5642 |
| 0.0027 | 64.5455 | 710 | 0.0055 | 0.7828 | 0.8597 | 0.9984 | 0.7202 | 0.5672 |
| 0.0019 | 65.4545 | 720 | 0.0056 | 0.7813 | 0.8562 | 0.9984 | 0.7131 | 0.5642 |
| 0.0018 | 66.3636 | 730 | 0.0056 | 0.7818 | 0.8595 | 0.9984 | 0.7199 | 0.5653 |
| 0.0034 | 67.2727 | 740 | 0.0055 | 0.7825 | 0.8560 | 0.9984 | 0.7127 | 0.5666 |
| 0.0021 | 68.1818 | 750 | 0.0056 | 0.7818 | 0.8540 | 0.9984 | 0.7087 | 0.5652 |
| 0.0014 | 69.0909 | 760 | 0.0056 | 0.7818 | 0.8542 | 0.9984 | 0.7091 | 0.5652 |
| 0.002 | 70.0 | 770 | 0.0056 | 0.7819 | 0.8600 | 0.9984 | 0.7207 | 0.5654 |
| 0.0009 | 70.9091 | 780 | 0.0057 | 0.7814 | 0.8586 | 0.9984 | 0.7181 | 0.5645 |
| 0.0014 | 71.8182 | 790 | 0.0054 | 0.7836 | 0.8604 | 0.9984 | 0.7216 | 0.5688 |
| 0.0014 | 72.7273 | 800 | 0.0057 | 0.7799 | 0.8571 | 0.9984 | 0.7150 | 0.5615 |
| 0.0036 | 73.6364 | 810 | 0.0058 | 0.7796 | 0.8603 | 0.9984 | 0.7214 | 0.5607 |
| 0.0024 | 74.5455 | 820 | 0.0058 | 0.7787 | 0.8648 | 0.9983 | 0.7304 | 0.5590 |
| 0.0036 | 75.4545 | 830 | 0.0058 | 0.7786 | 0.8580 | 0.9984 | 0.7168 | 0.5589 |
| 0.0016 | 76.3636 | 840 | 0.0057 | 0.7811 | 0.8626 | 0.9984 | 0.7260 | 0.5639 |
| 0.0023 | 77.2727 | 850 | 0.0057 | 0.7810 | 0.8586 | 0.9984 | 0.7181 | 0.5637 |
| 0.0018 | 78.1818 | 860 | 0.0057 | 0.7808 | 0.8576 | 0.9984 | 0.7161 | 0.5633 |
| 0.0019 | 79.0909 | 870 | 0.0058 | 0.7797 | 0.8637 | 0.9983 | 0.7283 | 0.5610 |
| 0.0022 | 80.0 | 880 | 0.0057 | 0.7817 | 0.8526 | 0.9984 | 0.7058 | 0.5650 |
| 0.0023 | 80.9091 | 890 | 0.0059 | 0.7796 | 0.8617 | 0.9984 | 0.7243 | 0.5608 |
| 0.0019 | 81.8182 | 900 | 0.0058 | 0.7803 | 0.8572 | 0.9984 | 0.7152 | 0.5622 |
| 0.003 | 82.7273 | 910 | 0.0058 | 0.7802 | 0.8557 | 0.9984 | 0.7121 | 0.5619 |
| 0.0024 | 83.6364 | 920 | 0.0058 | 0.7809 | 0.8611 | 0.9984 | 0.7230 | 0.5634 |
| 0.0014 | 84.5455 | 930 | 0.0058 | 0.7817 | 0.8581 | 0.9984 | 0.7169 | 0.5650 |
| 0.0017 | 85.4545 | 940 | 0.0058 | 0.7815 | 0.8587 | 0.9984 | 0.7181 | 0.5645 |
| 0.0031 | 86.3636 | 950 | 0.0058 | 0.7804 | 0.8589 | 0.9984 | 0.7187 | 0.5624 |
| 0.003 | 87.2727 | 960 | 0.0058 | 0.7809 | 0.8568 | 0.9984 | 0.7143 | 0.5634 |
| 0.0007 | 88.1818 | 970 | 0.0058 | 0.7806 | 0.8594 | 0.9984 | 0.7195 | 0.5629 |
| 0.0024 | 89.0909 | 980 | 0.0057 | 0.7821 | 0.8577 | 0.9984 | 0.7162 | 0.5657 |
| 0.0039 | 90.0 | 990 | 0.0057 | 0.7826 | 0.8596 | 0.9984 | 0.7199 | 0.5669 |
| 0.0016 | 90.9091 | 1000 | 0.0057 | 0.7826 | 0.8578 | 0.9984 | 0.7163 | 0.5669 |
| 0.0024 | 91.8182 | 1010 | 0.0057 | 0.7822 | 0.8574 | 0.9984 | 0.7156 | 0.5659 |
| 0.0027 | 92.7273 | 1020 | 0.0057 | 0.7817 | 0.8593 | 0.9984 | 0.7195 | 0.5651 |
| 0.0018 | 93.6364 | 1030 | 0.0057 | 0.7821 | 0.8591 | 0.9984 | 0.7190 | 0.5658 |
| 0.0012 | 94.5455 | 1040 | 0.0057 | 0.7821 | 0.8581 | 0.9984 | 0.7169 | 0.5659 |
| 0.0017 | 95.4545 | 1050 | 0.0057 | 0.7816 | 0.8587 | 0.9984 | 0.7181 | 0.5648 |
| 0.0022 | 96.3636 | 1060 | 0.0057 | 0.7815 | 0.8582 | 0.9984 | 0.7172 | 0.5647 |
| 0.0019 | 97.2727 | 1070 | 0.0058 | 0.7818 | 0.8580 | 0.9984 | 0.7168 | 0.5652 |
| 0.0041 | 98.1818 | 1080 | 0.0057 | 0.7816 | 0.8575 | 0.9984 | 0.7157 | 0.5648 |
| 0.0034 | 99.0909 | 1090 | 0.0057 | 0.7815 | 0.8566 | 0.9984 | 0.7139 | 0.5646 |
| 0.0016 | 100.0 | 1100 | 0.0057 | 0.7819 | 0.8573 | 0.9984 | 0.7153 | 0.5654 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"normal",
"abnormality"
] |
Prajakta8/segformer-b0-finetuned-segments-sidewalk-oct-22 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-segments-sidewalk-oct-22
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
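No evaluation results were logged for this run. For anyone re-running it, the training data can be pulled from the Hub as below; note that `segments/sidewalk-semantic` is typically gated, so a logged-in token may be required, and the column names are assumed to follow the dataset card.
```python
from datasets import load_dataset

# May require `huggingface-cli login` first if the dataset is gated.
ds = load_dataset("segments/sidewalk-semantic")
example = ds["train"][0]
image = example["pixel_values"]  # PIL image
mask = example["label"]          # per-pixel ids for the 35 classes below
```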
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"unlabeled",
"flat-road",
"flat-sidewalk",
"flat-crosswalk",
"flat-cyclinglane",
"flat-parkingdriveway",
"flat-railtrack",
"flat-curb",
"human-person",
"human-rider",
"vehicle-car",
"vehicle-truck",
"vehicle-bus",
"vehicle-tramtrain",
"vehicle-motorcycle",
"vehicle-bicycle",
"vehicle-caravan",
"vehicle-cartrailer",
"construction-building",
"construction-door",
"construction-wall",
"construction-fenceguardrail",
"construction-bridge",
"construction-tunnel",
"construction-stairs",
"object-pole",
"object-trafficsign",
"object-trafficlight",
"nature-vegetation",
"nature-terrain",
"sky",
"void-ground",
"void-dynamic",
"void-static",
"void-unclear"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_2
This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the NICOPOI-9/morphpad_coord_hgo_30_30_512_4class dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7675
- Mean Iou: 0.4538
- Mean Accuracy: 0.6212
- Overall Accuracy: 0.6213
- Accuracy 0-0: 0.5825
- Accuracy 0-90: 0.5481
- Accuracy 90-0: 0.7589
- Accuracy 90-90: 0.5953
- Iou 0-0: 0.4693
- Iou 0-90: 0.4525
- Iou 90-0: 0.4219
- Iou 90-90: 0.4714
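Despite the `b0` in the repository name, the card lists `nvidia/mit-b3` as the base checkpoint. The four classes encode pad orientations rather than object categories; after inference, predicted indices can be mapped back to those names through the model config, as in this sketch (assuming the checkpoint stores the label table shown at the end of this card).
```python
from transformers import SegformerForSemanticSegmentation

model = SegformerForSemanticSegmentation.from_pretrained(
    "NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_2"
)
print(model.config.id2label)
# Expected, per the label list below: {0: "0-0", 1: "0-90", 2: "90-0", 3: "90-90"}
```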
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 1.1766 | 4.2105 | 4000 | 1.1683 | 0.2583 | 0.4140 | 0.4143 | 0.2931 | 0.3809 | 0.6404 | 0.3417 | 0.2408 | 0.2494 | 0.2769 | 0.2663 |
| 1.0168 | 8.4211 | 8000 | 1.0180 | 0.3414 | 0.5100 | 0.5089 | 0.7153 | 0.4170 | 0.4442 | 0.4636 | 0.3414 | 0.3305 | 0.3283 | 0.3653 |
| 0.8472 | 12.6316 | 12000 | 0.8536 | 0.4077 | 0.5792 | 0.5793 | 0.6031 | 0.6125 | 0.4758 | 0.6254 | 0.4108 | 0.3947 | 0.4040 | 0.4211 |
| 1.0326 | 16.8421 | 16000 | 0.7675 | 0.4538 | 0.6212 | 0.6213 | 0.5825 | 0.5481 | 0.7589 | 0.5953 | 0.4693 | 0.4525 | 0.4219 | 0.4714 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_60epochs |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_60epochs
This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the NICOPOI-9/morphpad_coord_hgo_30_30_512_4class dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5626
- Mean Iou: 0.5820
- Mean Accuracy: 0.7358
- Overall Accuracy: 0.7358
- Accuracy 0-0: 0.7456
- Accuracy 0-90: 0.7128
- Accuracy 90-0: 0.7363
- Accuracy 90-90: 0.7484
- Iou 0-0: 0.5840
- Iou 0-90: 0.5781
- Iou 90-0: 0.5720
- Iou 90-90: 0.5939
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 60
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 1.2478 | 4.2105 | 4000 | 1.2564 | 0.2012 | 0.3564 | 0.3563 | 0.2794 | 0.7320 | 0.1626 | 0.2516 | 0.2015 | 0.2645 | 0.1424 | 0.1964 |
| 1.1864 | 8.4211 | 8000 | 1.0945 | 0.2822 | 0.4420 | 0.4430 | 0.3826 | 0.4025 | 0.3519 | 0.6312 | 0.2902 | 0.2692 | 0.2660 | 0.3036 |
| 0.9632 | 12.6316 | 12000 | 0.9682 | 0.3432 | 0.5103 | 0.5103 | 0.4745 | 0.4817 | 0.6326 | 0.4526 | 0.3457 | 0.3355 | 0.3377 | 0.3539 |
| 1.0223 | 16.8421 | 16000 | 0.8653 | 0.4020 | 0.5689 | 0.5690 | 0.4846 | 0.7162 | 0.5767 | 0.4982 | 0.4109 | 0.3743 | 0.3890 | 0.4336 |
| 0.7388 | 21.0526 | 20000 | 0.7888 | 0.4382 | 0.6064 | 0.6068 | 0.5402 | 0.6197 | 0.6163 | 0.6494 | 0.4767 | 0.4090 | 0.4268 | 0.4403 |
| 0.7634 | 25.2632 | 24000 | 0.7226 | 0.4711 | 0.6404 | 0.6406 | 0.6547 | 0.6184 | 0.5925 | 0.6962 | 0.4872 | 0.4634 | 0.4606 | 0.4733 |
| 0.6536 | 29.4737 | 28000 | 0.6801 | 0.4993 | 0.6654 | 0.6657 | 0.6463 | 0.6443 | 0.6653 | 0.7058 | 0.5182 | 0.4909 | 0.4806 | 0.5074 |
| 0.6216 | 33.6842 | 32000 | 0.6512 | 0.5192 | 0.6821 | 0.6826 | 0.6793 | 0.6185 | 0.6460 | 0.7848 | 0.5362 | 0.5204 | 0.5184 | 0.5019 |
| 0.6402 | 37.8947 | 36000 | 0.6295 | 0.5309 | 0.6932 | 0.6931 | 0.7050 | 0.7227 | 0.6512 | 0.6938 | 0.5298 | 0.5108 | 0.5348 | 0.5482 |
| 0.7389 | 42.1053 | 40000 | 0.6110 | 0.5475 | 0.7076 | 0.7077 | 0.7126 | 0.6793 | 0.7010 | 0.7374 | 0.5522 | 0.5449 | 0.5321 | 0.5610 |
| 0.6753 | 46.3158 | 44000 | 0.5858 | 0.5631 | 0.7203 | 0.7202 | 0.7338 | 0.6868 | 0.7393 | 0.7212 | 0.5700 | 0.5556 | 0.5437 | 0.5831 |
| 0.4944 | 50.5263 | 48000 | 0.5762 | 0.5711 | 0.7264 | 0.7264 | 0.7266 | 0.6984 | 0.7537 | 0.7268 | 0.5827 | 0.5703 | 0.5430 | 0.5885 |
| 0.4953 | 54.7368 | 52000 | 0.5676 | 0.5804 | 0.7337 | 0.7336 | 0.7532 | 0.6790 | 0.7716 | 0.7310 | 0.5759 | 0.5939 | 0.5535 | 0.5983 |
| 0.4828 | 58.9474 | 56000 | 0.5626 | 0.5820 | 0.7358 | 0.7358 | 0.7456 | 0.7128 | 0.7363 | 0.7484 | 0.5840 | 0.5781 | 0.5720 | 0.5939 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
PushkarA07/segformer-b0-finetuned-batch3-26May |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-batch3-26May
This model is a fine-tuned version of [PushkarA07/segformer-b0-finetuned-batch2w5-15Dec](https://huggingface.co/PushkarA07/segformer-b0-finetuned-batch2w5-15Dec) on the PushkarA07/batch3-tiles_second dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0014
- Mean Iou: 0.8838
- Mean Accuracy: 0.9271
- Overall Accuracy: 0.9994
- Accuracy Abnormality: 0.8545
- Iou Abnormality: 0.7682
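With only two classes, the prediction reduces to a binary abnormality mask; below is a sketch for overlaying that mask on a source tile (file names are placeholders, and class index 1 is assumed to be `abnormality` per the label list at the end of this card).
```python
import numpy as np
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo = "PushkarA07/segformer-b0-finetuned-batch3-26May"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)

tile = Image.open("tile.png").convert("RGB")  # placeholder tile
inputs = processor(images=tile, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
logits = torch.nn.functional.interpolate(
    logits, size=tile.size[::-1], mode="bilinear", align_corners=False
)
mask = (logits.argmax(dim=1)[0] == 1).numpy()  # assumed: class 1 = "abnormality"

# Paint abnormal pixels red at 50% opacity.
overlay = np.array(tile, dtype=np.float32)
overlay[mask] = 0.5 * overlay[mask] + 0.5 * np.array([255.0, 0.0, 0.0])
Image.fromarray(overlay.astype(np.uint8)).save("tile_overlay.png")
```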
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:---------------:|
| 0.0035 | 0.7143 | 10 | 0.0024 | 0.8344 | 0.8791 | 0.9992 | 0.7585 | 0.6697 |
| 0.0021 | 1.4286 | 20 | 0.0022 | 0.8422 | 0.8942 | 0.9992 | 0.7888 | 0.6852 |
| 0.0033 | 2.1429 | 30 | 0.0020 | 0.8474 | 0.8942 | 0.9992 | 0.7887 | 0.6956 |
| 0.007 | 2.8571 | 40 | 0.0020 | 0.8510 | 0.8943 | 0.9992 | 0.7889 | 0.7028 |
| 0.0036 | 3.5714 | 50 | 0.0019 | 0.8553 | 0.8983 | 0.9993 | 0.7968 | 0.7113 |
| 0.0032 | 4.2857 | 60 | 0.0018 | 0.8583 | 0.8969 | 0.9993 | 0.7940 | 0.7173 |
| 0.0026 | 5.0 | 70 | 0.0018 | 0.8594 | 0.9003 | 0.9993 | 0.8009 | 0.7195 |
| 0.0033 | 5.7143 | 80 | 0.0018 | 0.8600 | 0.8999 | 0.9993 | 0.8000 | 0.7207 |
| 0.0048 | 6.4286 | 90 | 0.0018 | 0.8616 | 0.8997 | 0.9993 | 0.7997 | 0.7239 |
| 0.003 | 7.1429 | 100 | 0.0017 | 0.8653 | 0.9105 | 0.9993 | 0.8214 | 0.7313 |
| 0.0026 | 7.8571 | 110 | 0.0017 | 0.8664 | 0.9122 | 0.9993 | 0.8248 | 0.7335 |
| 0.0022 | 8.5714 | 120 | 0.0017 | 0.8599 | 0.8902 | 0.9993 | 0.7806 | 0.7206 |
| 0.0023 | 9.2857 | 130 | 0.0017 | 0.8668 | 0.9069 | 0.9993 | 0.8140 | 0.7342 |
| 0.0028 | 10.0 | 140 | 0.0017 | 0.8681 | 0.9177 | 0.9993 | 0.8357 | 0.7368 |
| 0.0017 | 10.7143 | 150 | 0.0017 | 0.8672 | 0.9107 | 0.9993 | 0.8217 | 0.7350 |
| 0.002 | 11.4286 | 160 | 0.0017 | 0.8665 | 0.9020 | 0.9993 | 0.8041 | 0.7336 |
| 0.0018 | 12.1429 | 170 | 0.0017 | 0.8676 | 0.9047 | 0.9993 | 0.8097 | 0.7358 |
| 0.0021 | 12.8571 | 180 | 0.0017 | 0.8695 | 0.9240 | 0.9993 | 0.8484 | 0.7397 |
| 0.001 | 13.5714 | 190 | 0.0016 | 0.8700 | 0.9145 | 0.9993 | 0.8293 | 0.7407 |
| 0.0014 | 14.2857 | 200 | 0.0016 | 0.8721 | 0.9123 | 0.9994 | 0.8248 | 0.7449 |
| 0.0016 | 15.0 | 210 | 0.0016 | 0.8704 | 0.9082 | 0.9994 | 0.8166 | 0.7414 |
| 0.0023 | 15.7143 | 220 | 0.0017 | 0.8709 | 0.9175 | 0.9993 | 0.8352 | 0.7425 |
| 0.0023 | 16.4286 | 230 | 0.0016 | 0.8732 | 0.9188 | 0.9994 | 0.8379 | 0.7470 |
| 0.0019 | 17.1429 | 240 | 0.0016 | 0.8731 | 0.9153 | 0.9994 | 0.8308 | 0.7468 |
| 0.0018 | 17.8571 | 250 | 0.0016 | 0.8726 | 0.9094 | 0.9994 | 0.8191 | 0.7459 |
| 0.0027 | 18.5714 | 260 | 0.0016 | 0.8697 | 0.9016 | 0.9994 | 0.8033 | 0.7400 |
| 0.0013 | 19.2857 | 270 | 0.0016 | 0.8758 | 0.9214 | 0.9994 | 0.8431 | 0.7523 |
| 0.0036 | 20.0 | 280 | 0.0016 | 0.8750 | 0.9226 | 0.9994 | 0.8456 | 0.7506 |
| 0.0025 | 20.7143 | 290 | 0.0016 | 0.8751 | 0.9293 | 0.9994 | 0.8589 | 0.7509 |
| 0.0016 | 21.4286 | 300 | 0.0016 | 0.8725 | 0.9095 | 0.9994 | 0.8192 | 0.7457 |
| 0.0032 | 22.1429 | 310 | 0.0016 | 0.8737 | 0.9102 | 0.9994 | 0.8206 | 0.7481 |
| 0.002 | 22.8571 | 320 | 0.0016 | 0.8772 | 0.9304 | 0.9994 | 0.8610 | 0.7550 |
| 0.0012 | 23.5714 | 330 | 0.0016 | 0.8760 | 0.9151 | 0.9994 | 0.8304 | 0.7526 |
| 0.0012 | 24.2857 | 340 | 0.0016 | 0.8767 | 0.9226 | 0.9994 | 0.8454 | 0.7541 |
| 0.0028 | 25.0 | 350 | 0.0015 | 0.8771 | 0.9205 | 0.9994 | 0.8413 | 0.7548 |
| 0.0021 | 25.7143 | 360 | 0.0015 | 0.8769 | 0.9176 | 0.9994 | 0.8354 | 0.7543 |
| 0.0016 | 26.4286 | 370 | 0.0015 | 0.8763 | 0.9156 | 0.9994 | 0.8315 | 0.7533 |
| 0.0025 | 27.1429 | 380 | 0.0016 | 0.8742 | 0.9171 | 0.9994 | 0.8345 | 0.7490 |
| 0.0029 | 27.8571 | 390 | 0.0016 | 0.8763 | 0.9322 | 0.9994 | 0.8647 | 0.7532 |
| 0.0025 | 28.5714 | 400 | 0.0015 | 0.8767 | 0.9194 | 0.9994 | 0.8391 | 0.7539 |
| 0.0022 | 29.2857 | 410 | 0.0015 | 0.8783 | 0.9205 | 0.9994 | 0.8413 | 0.7572 |
| 0.0014 | 30.0 | 420 | 0.0016 | 0.8792 | 0.9318 | 0.9994 | 0.8639 | 0.7590 |
| 0.0027 | 30.7143 | 430 | 0.0015 | 0.8786 | 0.9269 | 0.9994 | 0.8541 | 0.7578 |
| 0.0038 | 31.4286 | 440 | 0.0015 | 0.8787 | 0.9270 | 0.9994 | 0.8544 | 0.7581 |
| 0.0014 | 32.1429 | 450 | 0.0015 | 0.8781 | 0.9243 | 0.9994 | 0.8490 | 0.7569 |
| 0.0015 | 32.8571 | 460 | 0.0015 | 0.8781 | 0.9214 | 0.9994 | 0.8431 | 0.7568 |
| 0.0034 | 33.5714 | 470 | 0.0015 | 0.8769 | 0.9136 | 0.9994 | 0.8275 | 0.7544 |
| 0.0048 | 34.2857 | 480 | 0.0015 | 0.8783 | 0.9310 | 0.9994 | 0.8623 | 0.7573 |
| 0.0025 | 35.0 | 490 | 0.0015 | 0.8783 | 0.9210 | 0.9994 | 0.8422 | 0.7572 |
| 0.0029 | 35.7143 | 500 | 0.0015 | 0.8788 | 0.9234 | 0.9994 | 0.8470 | 0.7582 |
| 0.0024 | 36.4286 | 510 | 0.0015 | 0.8797 | 0.9286 | 0.9994 | 0.8576 | 0.7600 |
| 0.0013 | 37.1429 | 520 | 0.0015 | 0.8792 | 0.9197 | 0.9994 | 0.8396 | 0.7589 |
| 0.0023 | 37.8571 | 530 | 0.0015 | 0.8797 | 0.9240 | 0.9994 | 0.8484 | 0.7601 |
| 0.0017 | 38.5714 | 540 | 0.0015 | 0.8802 | 0.9269 | 0.9994 | 0.8541 | 0.7610 |
| 0.0023 | 39.2857 | 550 | 0.0015 | 0.8801 | 0.9248 | 0.9994 | 0.8498 | 0.7609 |
| 0.0027 | 40.0 | 560 | 0.0015 | 0.8806 | 0.9297 | 0.9994 | 0.8596 | 0.7618 |
| 0.0029 | 40.7143 | 570 | 0.0015 | 0.8798 | 0.9209 | 0.9994 | 0.8420 | 0.7602 |
| 0.0035 | 41.4286 | 580 | 0.0015 | 0.8776 | 0.9129 | 0.9994 | 0.8261 | 0.7557 |
| 0.0025 | 42.1429 | 590 | 0.0015 | 0.8792 | 0.9197 | 0.9994 | 0.8397 | 0.7590 |
| 0.0011 | 42.8571 | 600 | 0.0015 | 0.8813 | 0.9312 | 0.9994 | 0.8628 | 0.7632 |
| 0.0022 | 43.5714 | 610 | 0.0015 | 0.8803 | 0.9254 | 0.9994 | 0.8511 | 0.7612 |
| 0.0029 | 44.2857 | 620 | 0.0015 | 0.8796 | 0.9199 | 0.9994 | 0.8400 | 0.7597 |
| 0.0017 | 45.0 | 630 | 0.0015 | 0.8808 | 0.9254 | 0.9994 | 0.8511 | 0.7621 |
| 0.0013 | 45.7143 | 640 | 0.0015 | 0.8815 | 0.9276 | 0.9994 | 0.8554 | 0.7635 |
| 0.0026 | 46.4286 | 650 | 0.0015 | 0.8798 | 0.9258 | 0.9994 | 0.8518 | 0.7602 |
| 0.0018 | 47.1429 | 660 | 0.0015 | 0.8803 | 0.9307 | 0.9994 | 0.8616 | 0.7613 |
| 0.0016 | 47.8571 | 670 | 0.0015 | 0.8811 | 0.9272 | 0.9994 | 0.8546 | 0.7627 |
| 0.001 | 48.5714 | 680 | 0.0015 | 0.8796 | 0.9160 | 0.9994 | 0.8321 | 0.7598 |
| 0.002 | 49.2857 | 690 | 0.0015 | 0.8807 | 0.9314 | 0.9994 | 0.8632 | 0.7621 |
| 0.0021 | 50.0 | 700 | 0.0015 | 0.8797 | 0.9235 | 0.9994 | 0.8473 | 0.7600 |
| 0.0019 | 50.7143 | 710 | 0.0015 | 0.8800 | 0.9229 | 0.9994 | 0.8461 | 0.7606 |
| 0.0013 | 51.4286 | 720 | 0.0015 | 0.8794 | 0.9212 | 0.9994 | 0.8427 | 0.7593 |
| 0.0032 | 52.1429 | 730 | 0.0015 | 0.8806 | 0.9229 | 0.9994 | 0.8461 | 0.7618 |
| 0.0015 | 52.8571 | 740 | 0.0015 | 0.8813 | 0.9268 | 0.9994 | 0.8540 | 0.7632 |
| 0.0027 | 53.5714 | 750 | 0.0015 | 0.8807 | 0.9235 | 0.9994 | 0.8472 | 0.7620 |
| 0.0018 | 54.2857 | 760 | 0.0015 | 0.8806 | 0.9209 | 0.9994 | 0.8420 | 0.7618 |
| 0.0028 | 55.0 | 770 | 0.0015 | 0.8817 | 0.9245 | 0.9994 | 0.8493 | 0.7639 |
| 0.0019 | 55.7143 | 780 | 0.0015 | 0.8797 | 0.9154 | 0.9994 | 0.8311 | 0.7601 |
| 0.0017 | 56.4286 | 790 | 0.0015 | 0.8815 | 0.9238 | 0.9994 | 0.8479 | 0.7636 |
| 0.001 | 57.1429 | 800 | 0.0015 | 0.8811 | 0.9227 | 0.9994 | 0.8456 | 0.7628 |
| 0.0022 | 57.8571 | 810 | 0.0015 | 0.8827 | 0.9303 | 0.9994 | 0.8610 | 0.7660 |
| 0.0012 | 58.5714 | 820 | 0.0015 | 0.8825 | 0.9237 | 0.9994 | 0.8475 | 0.7656 |
| 0.0018 | 59.2857 | 830 | 0.0015 | 0.8831 | 0.9268 | 0.9994 | 0.8538 | 0.7667 |
| 0.0022 | 60.0 | 840 | 0.0015 | 0.8821 | 0.9263 | 0.9994 | 0.8530 | 0.7649 |
| 0.0021 | 60.7143 | 850 | 0.0015 | 0.8821 | 0.9246 | 0.9994 | 0.8494 | 0.7647 |
| 0.0033 | 61.4286 | 860 | 0.0015 | 0.8818 | 0.9277 | 0.9994 | 0.8556 | 0.7642 |
| 0.0034 | 62.1429 | 870 | 0.0015 | 0.8817 | 0.9231 | 0.9994 | 0.8465 | 0.7640 |
| 0.0037 | 62.8571 | 880 | 0.0015 | 0.8821 | 0.9261 | 0.9994 | 0.8524 | 0.7647 |
| 0.0018 | 63.5714 | 890 | 0.0015 | 0.8831 | 0.9334 | 0.9994 | 0.8671 | 0.7668 |
| 0.0018 | 64.2857 | 900 | 0.0015 | 0.8830 | 0.9316 | 0.9994 | 0.8635 | 0.7667 |
| 0.0035 | 65.0 | 910 | 0.0015 | 0.8806 | 0.9153 | 0.9994 | 0.8309 | 0.7618 |
| 0.0018 | 65.7143 | 920 | 0.0015 | 0.8814 | 0.9312 | 0.9994 | 0.8627 | 0.7635 |
| 0.0015 | 66.4286 | 930 | 0.0015 | 0.8826 | 0.9264 | 0.9994 | 0.8531 | 0.7657 |
| 0.0016 | 67.1429 | 940 | 0.0015 | 0.8836 | 0.9358 | 0.9994 | 0.8719 | 0.7677 |
| 0.0023 | 67.8571 | 950 | 0.0015 | 0.8820 | 0.9215 | 0.9994 | 0.8433 | 0.7645 |
| 0.0015 | 68.5714 | 960 | 0.0015 | 0.8816 | 0.9283 | 0.9994 | 0.8569 | 0.7639 |
| 0.0023 | 69.2857 | 970 | 0.0015 | 0.8833 | 0.9302 | 0.9994 | 0.8607 | 0.7673 |
| 0.0036 | 70.0 | 980 | 0.0015 | 0.8824 | 0.9256 | 0.9994 | 0.8515 | 0.7654 |
| 0.0011 | 70.7143 | 990 | 0.0015 | 0.8816 | 0.9268 | 0.9994 | 0.8538 | 0.7637 |
| 0.0034 | 71.4286 | 1000 | 0.0015 | 0.8818 | 0.9267 | 0.9994 | 0.8536 | 0.7643 |
| 0.0014 | 72.1429 | 1010 | 0.0015 | 0.8833 | 0.9303 | 0.9994 | 0.8609 | 0.7672 |
| 0.0011 | 72.8571 | 1020 | 0.0015 | 0.8827 | 0.9287 | 0.9994 | 0.8577 | 0.7659 |
| 0.0014 | 73.5714 | 1030 | 0.0015 | 0.8819 | 0.9257 | 0.9994 | 0.8516 | 0.7643 |
| 0.001 | 74.2857 | 1040 | 0.0015 | 0.8829 | 0.9294 | 0.9994 | 0.8591 | 0.7665 |
| 0.0026 | 75.0 | 1050 | 0.0015 | 0.8812 | 0.9210 | 0.9994 | 0.8423 | 0.7629 |
| 0.0014 | 75.7143 | 1060 | 0.0015 | 0.8823 | 0.9288 | 0.9994 | 0.8579 | 0.7653 |
| 0.0029 | 76.4286 | 1070 | 0.0015 | 0.8825 | 0.9244 | 0.9994 | 0.8490 | 0.7656 |
| 0.0007 | 77.1429 | 1080 | 0.0015 | 0.8828 | 0.9268 | 0.9994 | 0.8538 | 0.7662 |
| 0.0021 | 77.8571 | 1090 | 0.0015 | 0.8829 | 0.9257 | 0.9994 | 0.8517 | 0.7664 |
| 0.002 | 78.5714 | 1100 | 0.0015 | 0.8835 | 0.9270 | 0.9994 | 0.8542 | 0.7675 |
| 0.0025 | 79.2857 | 1110 | 0.0015 | 0.8833 | 0.9276 | 0.9994 | 0.8554 | 0.7673 |
| 0.0026 | 80.0 | 1120 | 0.0015 | 0.8831 | 0.9255 | 0.9994 | 0.8513 | 0.7669 |
| 0.0035 | 80.7143 | 1130 | 0.0015 | 0.8841 | 0.9306 | 0.9994 | 0.8615 | 0.7689 |
| 0.0016 | 81.4286 | 1140 | 0.0015 | 0.8833 | 0.9256 | 0.9994 | 0.8515 | 0.7672 |
| 0.0018 | 82.1429 | 1150 | 0.0015 | 0.8828 | 0.9266 | 0.9994 | 0.8535 | 0.7661 |
| 0.0024 | 82.8571 | 1160 | 0.0015 | 0.8831 | 0.9280 | 0.9994 | 0.8563 | 0.7668 |
| 0.0022 | 83.5714 | 1170 | 0.0015 | 0.8836 | 0.9309 | 0.9994 | 0.8620 | 0.7677 |
| 0.0018 | 84.2857 | 1180 | 0.0015 | 0.8835 | 0.9303 | 0.9994 | 0.8608 | 0.7676 |
| 0.0014 | 85.0 | 1190 | 0.0015 | 0.8832 | 0.9266 | 0.9994 | 0.8535 | 0.7669 |
| 0.0013 | 85.7143 | 1200 | 0.0015 | 0.8838 | 0.9273 | 0.9994 | 0.8548 | 0.7682 |
| 0.0033 | 86.4286 | 1210 | 0.0014 | 0.8836 | 0.9316 | 0.9994 | 0.8635 | 0.7678 |
| 0.0023 | 87.1429 | 1220 | 0.0015 | 0.8831 | 0.9231 | 0.9994 | 0.8465 | 0.7667 |
| 0.0027 | 87.8571 | 1230 | 0.0014 | 0.8834 | 0.9284 | 0.9994 | 0.8571 | 0.7675 |
| 0.0014 | 88.5714 | 1240 | 0.0015 | 0.8833 | 0.9285 | 0.9994 | 0.8572 | 0.7672 |
| 0.0025 | 89.2857 | 1250 | 0.0014 | 0.8836 | 0.9276 | 0.9994 | 0.8555 | 0.7678 |
| 0.003 | 90.0 | 1260 | 0.0014 | 0.8842 | 0.9299 | 0.9994 | 0.8600 | 0.7690 |
| 0.0022 | 90.7143 | 1270 | 0.0014 | 0.8842 | 0.9271 | 0.9994 | 0.8545 | 0.7690 |
| 0.0024 | 91.4286 | 1280 | 0.0014 | 0.8839 | 0.9285 | 0.9994 | 0.8572 | 0.7684 |
| 0.0017 | 92.1429 | 1290 | 0.0014 | 0.8835 | 0.9262 | 0.9994 | 0.8526 | 0.7676 |
| 0.0014 | 92.8571 | 1300 | 0.0014 | 0.8830 | 0.9243 | 0.9994 | 0.8488 | 0.7666 |
| 0.0018 | 93.5714 | 1310 | 0.0014 | 0.8836 | 0.9293 | 0.9994 | 0.8589 | 0.7678 |
| 0.0019 | 94.2857 | 1320 | 0.0014 | 0.8833 | 0.9265 | 0.9994 | 0.8532 | 0.7673 |
| 0.0027 | 95.0 | 1330 | 0.0015 | 0.8831 | 0.9250 | 0.9994 | 0.8503 | 0.7667 |
| 0.0008 | 95.7143 | 1340 | 0.0015 | 0.8835 | 0.9275 | 0.9994 | 0.8553 | 0.7677 |
| 0.0033 | 96.4286 | 1350 | 0.0015 | 0.8835 | 0.9280 | 0.9994 | 0.8563 | 0.7675 |
| 0.0021 | 97.1429 | 1360 | 0.0015 | 0.8834 | 0.9268 | 0.9994 | 0.8540 | 0.7675 |
| 0.0016 | 97.8571 | 1370 | 0.0015 | 0.8836 | 0.9271 | 0.9994 | 0.8545 | 0.7677 |
| 0.0039 | 98.5714 | 1380 | 0.0014 | 0.8835 | 0.9255 | 0.9994 | 0.8513 | 0.7677 |
| 0.0017 | 99.2857 | 1390 | 0.0014 | 0.8836 | 0.9257 | 0.9994 | 0.8517 | 0.7677 |
| 0.0046 | 100.0 | 1400 | 0.0014 | 0.8838 | 0.9271 | 0.9994 | 0.8545 | 0.7682 |
### Framework versions
- Transformers 4.52.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"normal",
"abnormality"
] |
PushkarA07/segformer-b0-finetuned-batch3-26May-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-batch3-26May-2
This model is a fine-tuned version of [PushkarA07/segformer-b0-finetuned-batch2w5-15Dec](https://huggingface.co/PushkarA07/segformer-b0-finetuned-batch2w5-15Dec) on the PushkarA07/batch3-tiles_third dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0007
- Mean Iou: 0.9173
- Mean Accuracy: 0.9515
- Overall Accuracy: 0.9997
- Accuracy Abnormality: 0.9030
- Iou Abnormality: 0.8348
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
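Like the previous batches, this run warm-starts from an earlier fine-tune rather than from the raw `nvidia/mit-b0` base; a sketch of how such chained fine-tuning begins (the two-label head is unchanged, so the weights load without re-initialization).
```python
from transformers import SegformerForSemanticSegmentation

# Warm-start from the prior batch's checkpoint; same ("normal", "abnormality") head.
model = SegformerForSemanticSegmentation.from_pretrained(
    "PushkarA07/segformer-b0-finetuned-batch2w5-15Dec",
    num_labels=2,
    id2label={0: "normal", 1: "abnormality"},
    label2id={"normal": 0, "abnormality": 1},
)
# ...then fine-tune on PushkarA07/batch3-tiles_third with the settings above.
```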
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:---------------:|
| 0.0012 | 0.7143 | 10 | 0.0017 | 0.8437 | 0.8917 | 0.9994 | 0.7835 | 0.6879 |
| 0.0012 | 1.4286 | 20 | 0.0013 | 0.8539 | 0.8779 | 0.9995 | 0.7559 | 0.7082 |
| 0.001 | 2.1429 | 30 | 0.0012 | 0.8684 | 0.8944 | 0.9996 | 0.7889 | 0.7372 |
| 0.0006 | 2.8571 | 40 | 0.0011 | 0.8746 | 0.8991 | 0.9996 | 0.7983 | 0.7496 |
| 0.001 | 3.5714 | 50 | 0.0010 | 0.8839 | 0.9185 | 0.9996 | 0.8371 | 0.7681 |
| 0.0012 | 4.2857 | 60 | 0.0010 | 0.8867 | 0.9189 | 0.9996 | 0.8380 | 0.7737 |
| 0.0022 | 5.0 | 70 | 0.0010 | 0.8901 | 0.9211 | 0.9996 | 0.8423 | 0.7806 |
| 0.0017 | 5.7143 | 80 | 0.0009 | 0.8913 | 0.9254 | 0.9996 | 0.8510 | 0.7829 |
| 0.0016 | 6.4286 | 90 | 0.0009 | 0.8921 | 0.9237 | 0.9996 | 0.8475 | 0.7846 |
| 0.001 | 7.1429 | 100 | 0.0009 | 0.8946 | 0.9278 | 0.9996 | 0.8557 | 0.7895 |
| 0.0012 | 7.8571 | 110 | 0.0009 | 0.8935 | 0.9226 | 0.9996 | 0.8453 | 0.7873 |
| 0.0011 | 8.5714 | 120 | 0.0009 | 0.8963 | 0.9314 | 0.9996 | 0.8629 | 0.7929 |
| 0.001 | 9.2857 | 130 | 0.0009 | 0.8980 | 0.9325 | 0.9996 | 0.8652 | 0.7963 |
| 0.0006 | 10.0 | 140 | 0.0009 | 0.8978 | 0.9303 | 0.9996 | 0.8608 | 0.7959 |
| 0.001 | 10.7143 | 150 | 0.0009 | 0.8996 | 0.9366 | 0.9997 | 0.8732 | 0.7995 |
| 0.001 | 11.4286 | 160 | 0.0009 | 0.9016 | 0.9463 | 0.9997 | 0.8928 | 0.8036 |
| 0.0004 | 12.1429 | 170 | 0.0009 | 0.9019 | 0.9494 | 0.9997 | 0.8990 | 0.8042 |
| 0.0002 | 12.8571 | 180 | 0.0009 | 0.9004 | 0.9341 | 0.9997 | 0.8683 | 0.8012 |
| 0.0011 | 13.5714 | 190 | 0.0009 | 0.9026 | 0.9488 | 0.9997 | 0.8977 | 0.8055 |
| 0.0005 | 14.2857 | 200 | 0.0008 | 0.9014 | 0.9385 | 0.9997 | 0.8772 | 0.8031 |
| 0.0007 | 15.0 | 210 | 0.0008 | 0.9013 | 0.9354 | 0.9997 | 0.8709 | 0.8028 |
| 0.0013 | 15.7143 | 220 | 0.0008 | 0.9047 | 0.9445 | 0.9997 | 0.8892 | 0.8098 |
| 0.0004 | 16.4286 | 230 | 0.0008 | 0.9015 | 0.9334 | 0.9997 | 0.8670 | 0.8034 |
| 0.0009 | 17.1429 | 240 | 0.0008 | 0.9057 | 0.9500 | 0.9997 | 0.9002 | 0.8117 |
| 0.0016 | 17.8571 | 250 | 0.0008 | 0.9060 | 0.9451 | 0.9997 | 0.8904 | 0.8124 |
| 0.0011 | 18.5714 | 260 | 0.0008 | 0.9052 | 0.9432 | 0.9997 | 0.8865 | 0.8107 |
| 0.0007 | 19.2857 | 270 | 0.0008 | 0.9069 | 0.9476 | 0.9997 | 0.8953 | 0.8141 |
| 0.0007 | 20.0 | 280 | 0.0008 | 0.9073 | 0.9488 | 0.9997 | 0.8977 | 0.8150 |
| 0.001 | 20.7143 | 290 | 0.0008 | 0.9033 | 0.9329 | 0.9997 | 0.8660 | 0.8068 |
| 0.0006 | 21.4286 | 300 | 0.0008 | 0.9079 | 0.9492 | 0.9997 | 0.8985 | 0.8162 |
| 0.0009 | 22.1429 | 310 | 0.0008 | 0.9070 | 0.9494 | 0.9997 | 0.8990 | 0.8143 |
| 0.0007 | 22.8571 | 320 | 0.0008 | 0.9070 | 0.9438 | 0.9997 | 0.8877 | 0.8142 |
| 0.0006 | 23.5714 | 330 | 0.0008 | 0.9071 | 0.9458 | 0.9997 | 0.8918 | 0.8146 |
| 0.001 | 24.2857 | 340 | 0.0008 | 0.9088 | 0.9455 | 0.9997 | 0.8912 | 0.8179 |
| 0.0006 | 25.0 | 350 | 0.0008 | 0.9105 | 0.9477 | 0.9997 | 0.8955 | 0.8214 |
| 0.0009 | 25.7143 | 360 | 0.0008 | 0.9090 | 0.9477 | 0.9997 | 0.8955 | 0.8184 |
| 0.001 | 26.4286 | 370 | 0.0008 | 0.9096 | 0.9521 | 0.9997 | 0.9043 | 0.8196 |
| 0.0012 | 27.1429 | 380 | 0.0008 | 0.9089 | 0.9465 | 0.9997 | 0.8931 | 0.8181 |
| 0.0006 | 27.8571 | 390 | 0.0008 | 0.9100 | 0.9487 | 0.9997 | 0.8976 | 0.8203 |
| 0.0006 | 28.5714 | 400 | 0.0008 | 0.9097 | 0.9484 | 0.9997 | 0.8970 | 0.8198 |
| 0.0004 | 29.2857 | 410 | 0.0008 | 0.9088 | 0.9565 | 0.9997 | 0.9131 | 0.8179 |
| 0.0013 | 30.0 | 420 | 0.0008 | 0.9073 | 0.9413 | 0.9997 | 0.8828 | 0.8150 |
| 0.0007 | 30.7143 | 430 | 0.0008 | 0.9086 | 0.9441 | 0.9997 | 0.8883 | 0.8176 |
| 0.0011 | 31.4286 | 440 | 0.0008 | 0.9109 | 0.9575 | 0.9997 | 0.9151 | 0.8221 |
| 0.0004 | 32.1429 | 450 | 0.0008 | 0.9112 | 0.9525 | 0.9997 | 0.9051 | 0.8227 |
| 0.0011 | 32.8571 | 460 | 0.0008 | 0.9118 | 0.9469 | 0.9997 | 0.8939 | 0.8239 |
| 0.0006 | 33.5714 | 470 | 0.0008 | 0.9112 | 0.9559 | 0.9997 | 0.9119 | 0.8228 |
| 0.0004 | 34.2857 | 480 | 0.0008 | 0.9104 | 0.9535 | 0.9997 | 0.9072 | 0.8210 |
| 0.0006 | 35.0 | 490 | 0.0008 | 0.9107 | 0.9450 | 0.9997 | 0.8902 | 0.8218 |
| 0.0011 | 35.7143 | 500 | 0.0008 | 0.9128 | 0.9509 | 0.9997 | 0.9019 | 0.8258 |
| 0.0004 | 36.4286 | 510 | 0.0008 | 0.9118 | 0.9502 | 0.9997 | 0.9005 | 0.8239 |
| 0.0007 | 37.1429 | 520 | 0.0008 | 0.9135 | 0.9534 | 0.9997 | 0.9070 | 0.8273 |
| 0.0005 | 37.8571 | 530 | 0.0008 | 0.9106 | 0.9422 | 0.9997 | 0.8845 | 0.8216 |
| 0.0011 | 38.5714 | 540 | 0.0008 | 0.9125 | 0.9501 | 0.9997 | 0.9004 | 0.8252 |
| 0.0006 | 39.2857 | 550 | 0.0008 | 0.9130 | 0.9553 | 0.9997 | 0.9107 | 0.8264 |
| 0.001 | 40.0 | 560 | 0.0008 | 0.9110 | 0.9454 | 0.9997 | 0.8909 | 0.8224 |
| 0.001 | 40.7143 | 570 | 0.0008 | 0.9135 | 0.9546 | 0.9997 | 0.9094 | 0.8272 |
| 0.0009 | 41.4286 | 580 | 0.0008 | 0.9131 | 0.9529 | 0.9997 | 0.9060 | 0.8265 |
| 0.0007 | 42.1429 | 590 | 0.0008 | 0.9112 | 0.9479 | 0.9997 | 0.8959 | 0.8227 |
| 0.0005 | 42.8571 | 600 | 0.0007 | 0.9131 | 0.9514 | 0.9997 | 0.9029 | 0.8265 |
| 0.0005 | 43.5714 | 610 | 0.0008 | 0.9110 | 0.9435 | 0.9997 | 0.8871 | 0.8224 |
| 0.0005 | 44.2857 | 620 | 0.0008 | 0.9126 | 0.9575 | 0.9997 | 0.9152 | 0.8255 |
| 0.0003 | 45.0 | 630 | 0.0007 | 0.9121 | 0.9480 | 0.9997 | 0.8962 | 0.8244 |
| 0.0003 | 45.7143 | 640 | 0.0008 | 0.9109 | 0.9432 | 0.9997 | 0.8865 | 0.8221 |
| 0.0006 | 46.4286 | 650 | 0.0007 | 0.9139 | 0.9519 | 0.9997 | 0.9039 | 0.8281 |
| 0.0003 | 47.1429 | 660 | 0.0008 | 0.9132 | 0.9547 | 0.9997 | 0.9096 | 0.8267 |
| 0.0012 | 47.8571 | 670 | 0.0008 | 0.9114 | 0.9444 | 0.9997 | 0.8888 | 0.8230 |
| 0.0008 | 48.5714 | 680 | 0.0007 | 0.9138 | 0.9546 | 0.9997 | 0.9093 | 0.8279 |
| 0.001 | 49.2857 | 690 | 0.0007 | 0.9136 | 0.9512 | 0.9997 | 0.9025 | 0.8275 |
| 0.0009 | 50.0 | 700 | 0.0007 | 0.9127 | 0.9490 | 0.9997 | 0.8982 | 0.8258 |
| 0.0006 | 50.7143 | 710 | 0.0007 | 0.9143 | 0.9527 | 0.9997 | 0.9055 | 0.8289 |
| 0.0011 | 51.4286 | 720 | 0.0007 | 0.9127 | 0.9475 | 0.9997 | 0.8951 | 0.8257 |
| 0.0003 | 52.1429 | 730 | 0.0007 | 0.9138 | 0.9500 | 0.9997 | 0.9002 | 0.8280 |
| 0.0005 | 52.8571 | 740 | 0.0007 | 0.9141 | 0.9541 | 0.9997 | 0.9083 | 0.8285 |
| 0.0011 | 53.5714 | 750 | 0.0007 | 0.9146 | 0.9526 | 0.9997 | 0.9052 | 0.8295 |
| 0.0005 | 54.2857 | 760 | 0.0007 | 0.9139 | 0.9509 | 0.9997 | 0.9019 | 0.8281 |
| 0.0005 | 55.0 | 770 | 0.0007 | 0.9134 | 0.9468 | 0.9997 | 0.8937 | 0.8270 |
| 0.0009 | 55.7143 | 780 | 0.0007 | 0.9150 | 0.9528 | 0.9997 | 0.9058 | 0.8302 |
| 0.0011 | 56.4286 | 790 | 0.0007 | 0.9133 | 0.9461 | 0.9997 | 0.8924 | 0.8268 |
| 0.0015 | 57.1429 | 800 | 0.0007 | 0.9143 | 0.9507 | 0.9997 | 0.9016 | 0.8289 |
| 0.0009 | 57.8571 | 810 | 0.0007 | 0.9148 | 0.9509 | 0.9997 | 0.9019 | 0.8299 |
| 0.0006 | 58.5714 | 820 | 0.0007 | 0.9146 | 0.9507 | 0.9997 | 0.9015 | 0.8294 |
| 0.0003 | 59.2857 | 830 | 0.0007 | 0.9152 | 0.9530 | 0.9997 | 0.9062 | 0.8307 |
| 0.0006 | 60.0 | 840 | 0.0007 | 0.9144 | 0.9487 | 0.9997 | 0.8974 | 0.8292 |
| 0.0006 | 60.7143 | 850 | 0.0007 | 0.9149 | 0.9529 | 0.9997 | 0.9060 | 0.8300 |
| 0.0006 | 61.4286 | 860 | 0.0007 | 0.9159 | 0.9556 | 0.9997 | 0.9115 | 0.8320 |
| 0.0004 | 62.1429 | 870 | 0.0007 | 0.9143 | 0.9499 | 0.9997 | 0.8999 | 0.8288 |
| 0.0008 | 62.8571 | 880 | 0.0007 | 0.9150 | 0.9537 | 0.9997 | 0.9076 | 0.8303 |
| 0.0008 | 63.5714 | 890 | 0.0007 | 0.9154 | 0.9493 | 0.9997 | 0.8987 | 0.8311 |
| 0.0006 | 64.2857 | 900 | 0.0007 | 0.9158 | 0.9572 | 0.9997 | 0.9146 | 0.8319 |
| 0.0013 | 65.0 | 910 | 0.0007 | 0.9150 | 0.9509 | 0.9997 | 0.9020 | 0.8304 |
| 0.0008 | 65.7143 | 920 | 0.0007 | 0.9148 | 0.9487 | 0.9997 | 0.8974 | 0.8300 |
| 0.0009 | 66.4286 | 930 | 0.0007 | 0.9164 | 0.9555 | 0.9997 | 0.9111 | 0.8332 |
| 0.0007 | 67.1429 | 940 | 0.0007 | 0.9167 | 0.9521 | 0.9997 | 0.9043 | 0.8337 |
| 0.0005 | 67.8571 | 950 | 0.0007 | 0.9163 | 0.9540 | 0.9997 | 0.9082 | 0.8328 |
| 0.0009 | 68.5714 | 960 | 0.0007 | 0.9157 | 0.9489 | 0.9997 | 0.8979 | 0.8316 |
| 0.001 | 69.2857 | 970 | 0.0007 | 0.9160 | 0.9548 | 0.9997 | 0.9098 | 0.8322 |
| 0.0006 | 70.0 | 980 | 0.0007 | 0.9156 | 0.9492 | 0.9997 | 0.8985 | 0.8315 |
| 0.001 | 70.7143 | 990 | 0.0007 | 0.9160 | 0.9507 | 0.9997 | 0.9015 | 0.8323 |
| 0.0006 | 71.4286 | 1000 | 0.0007 | 0.9154 | 0.9484 | 0.9997 | 0.8970 | 0.8310 |
| 0.0014 | 72.1429 | 1010 | 0.0007 | 0.9165 | 0.9534 | 0.9997 | 0.9068 | 0.8332 |
| 0.0008 | 72.8571 | 1020 | 0.0007 | 0.9165 | 0.9513 | 0.9997 | 0.9028 | 0.8333 |
| 0.0007 | 73.5714 | 1030 | 0.0007 | 0.9167 | 0.9530 | 0.9997 | 0.9061 | 0.8338 |
| 0.0008 | 74.2857 | 1040 | 0.0007 | 0.9159 | 0.9526 | 0.9997 | 0.9052 | 0.8321 |
| 0.0006 | 75.0 | 1050 | 0.0007 | 0.9154 | 0.9503 | 0.9997 | 0.9007 | 0.8312 |
| 0.0007 | 75.7143 | 1060 | 0.0007 | 0.9165 | 0.9545 | 0.9997 | 0.9091 | 0.8332 |
| 0.0011 | 76.4286 | 1070 | 0.0007 | 0.9168 | 0.9543 | 0.9997 | 0.9087 | 0.8338 |
| 0.0009 | 77.1429 | 1080 | 0.0007 | 0.9158 | 0.9527 | 0.9997 | 0.9055 | 0.8320 |
| 0.0005 | 77.8571 | 1090 | 0.0007 | 0.9168 | 0.9511 | 0.9997 | 0.9023 | 0.8338 |
| 0.0005 | 78.5714 | 1100 | 0.0007 | 0.9162 | 0.9502 | 0.9997 | 0.9005 | 0.8328 |
| 0.0009 | 79.2857 | 1110 | 0.0007 | 0.9174 | 0.9533 | 0.9997 | 0.9068 | 0.8350 |
| 0.0004 | 80.0 | 1120 | 0.0007 | 0.9162 | 0.9495 | 0.9997 | 0.8990 | 0.8327 |
| 0.0002 | 80.7143 | 1130 | 0.0007 | 0.9165 | 0.9507 | 0.9997 | 0.9014 | 0.8332 |
| 0.0005 | 81.4286 | 1140 | 0.0007 | 0.9164 | 0.9499 | 0.9997 | 0.8999 | 0.8332 |
| 0.0009 | 82.1429 | 1150 | 0.0007 | 0.9170 | 0.9543 | 0.9997 | 0.9087 | 0.8342 |
| 0.0009 | 82.8571 | 1160 | 0.0007 | 0.9165 | 0.9523 | 0.9997 | 0.9048 | 0.8334 |
| 0.0006 | 83.5714 | 1170 | 0.0007 | 0.9165 | 0.9519 | 0.9997 | 0.9039 | 0.8332 |
| 0.0008 | 84.2857 | 1180 | 0.0007 | 0.9161 | 0.9515 | 0.9997 | 0.9032 | 0.8325 |
| 0.0006 | 85.0 | 1190 | 0.0007 | 0.9169 | 0.9525 | 0.9997 | 0.9051 | 0.8340 |
| 0.0005 | 85.7143 | 1200 | 0.0007 | 0.9167 | 0.9518 | 0.9997 | 0.9037 | 0.8337 |
| 0.0002 | 86.4286 | 1210 | 0.0007 | 0.9167 | 0.9519 | 0.9997 | 0.9040 | 0.8337 |
| 0.0004 | 87.1429 | 1220 | 0.0007 | 0.9167 | 0.9518 | 0.9997 | 0.9037 | 0.8337 |
| 0.0009 | 87.8571 | 1230 | 0.0007 | 0.9169 | 0.9520 | 0.9997 | 0.9042 | 0.8340 |
| 0.0011 | 88.5714 | 1240 | 0.0007 | 0.9171 | 0.9526 | 0.9997 | 0.9053 | 0.8345 |
| 0.0006 | 89.2857 | 1250 | 0.0007 | 0.9171 | 0.9518 | 0.9997 | 0.9037 | 0.8346 |
| 0.0007 | 90.0 | 1260 | 0.0007 | 0.9174 | 0.9551 | 0.9997 | 0.9104 | 0.8351 |
| 0.0005 | 90.7143 | 1270 | 0.0007 | 0.9168 | 0.9534 | 0.9997 | 0.9069 | 0.8340 |
| 0.0007 | 91.4286 | 1280 | 0.0007 | 0.9169 | 0.9519 | 0.9997 | 0.9040 | 0.8341 |
| 0.0009 | 92.1429 | 1290 | 0.0007 | 0.9175 | 0.9526 | 0.9997 | 0.9052 | 0.8352 |
| 0.0009 | 92.8571 | 1300 | 0.0007 | 0.9177 | 0.9532 | 0.9997 | 0.9066 | 0.8356 |
| 0.0007 | 93.5714 | 1310 | 0.0007 | 0.9174 | 0.9525 | 0.9997 | 0.9051 | 0.8351 |
| 0.0007 | 94.2857 | 1320 | 0.0007 | 0.9170 | 0.9518 | 0.9997 | 0.9037 | 0.8343 |
| 0.0015 | 95.0 | 1330 | 0.0007 | 0.9173 | 0.9535 | 0.9997 | 0.9071 | 0.8349 |
| 0.0005 | 95.7143 | 1340 | 0.0007 | 0.9176 | 0.9534 | 0.9997 | 0.9069 | 0.8355 |
| 0.0007 | 96.4286 | 1350 | 0.0007 | 0.9174 | 0.9525 | 0.9997 | 0.9051 | 0.8351 |
| 0.001 | 97.1429 | 1360 | 0.0007 | 0.9175 | 0.9527 | 0.9997 | 0.9056 | 0.8353 |
| 0.001 | 97.8571 | 1370 | 0.0007 | 0.9175 | 0.9526 | 0.9997 | 0.9052 | 0.8354 |
| 0.0007 | 98.5714 | 1380 | 0.0007 | 0.9173 | 0.9518 | 0.9997 | 0.9037 | 0.8349 |
| 0.0006 | 99.2857 | 1390 | 0.0007 | 0.9175 | 0.9514 | 0.9997 | 0.9029 | 0.8352 |
| 0.0011 | 100.0 | 1400 | 0.0007 | 0.9173 | 0.9515 | 0.9997 | 0.9030 | 0.8348 |
### Framework versions
- Transformers 4.52.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"normal",
"abnormality"
] |
NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_120epochs |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_120epochs
This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the NICOPOI-9/morphpad_coord_hgo_30_30_512_4class dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3128
- Mean Iou: 0.7857
- Mean Accuracy: 0.8800
- Overall Accuracy: 0.8799
- Accuracy 0-0: 0.8782
- Accuracy 0-90: 0.8842
- Accuracy 90-0: 0.8809
- Accuracy 90-90: 0.8765
- Iou 0-0: 0.7869
- Iou 0-90: 0.7762
- Iou 90-0: 0.7880
- Iou 90-90: 0.7916
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 120
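These settings correspond roughly to the `Trainer` configuration sketched below (argument names follow the standard `transformers` API; the `output_dir` is hypothetical and the mapping is an assumption, not the original training script):

```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the hyperparameters above.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-morphpad",  # hypothetical path
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",        # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=120,
)
```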
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:--------:|:------:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 1.188 | 4.2105 | 4000 | 1.2023 | 0.2499 | 0.3996 | 0.3995 | 0.3256 | 0.4999 | 0.4603 | 0.3124 | 0.2510 | 0.2542 | 0.2429 | 0.2514 |
| 0.9829 | 8.4211 | 8000 | 0.9446 | 0.3676 | 0.5326 | 0.5326 | 0.4568 | 0.5068 | 0.7157 | 0.4508 | 0.3872 | 0.3492 | 0.3434 | 0.3907 |
| 0.7539 | 12.6316 | 12000 | 0.7804 | 0.4475 | 0.6163 | 0.6168 | 0.5690 | 0.6618 | 0.5314 | 0.7030 | 0.4806 | 0.4244 | 0.4335 | 0.4513 |
| 0.969 | 16.8421 | 16000 | 0.6554 | 0.5152 | 0.6775 | 0.6780 | 0.6172 | 0.7013 | 0.6543 | 0.7372 | 0.5533 | 0.4849 | 0.5090 | 0.5136 |
| 0.5649 | 21.0526 | 20000 | 0.5879 | 0.5597 | 0.7156 | 0.7157 | 0.6964 | 0.7740 | 0.6842 | 0.7078 | 0.5869 | 0.5181 | 0.5555 | 0.5782 |
| 0.5983 | 25.2632 | 24000 | 0.5229 | 0.5963 | 0.7464 | 0.7460 | 0.8236 | 0.7139 | 0.7132 | 0.7348 | 0.5647 | 0.6058 | 0.6038 | 0.6110 |
| 0.5077 | 29.4737 | 28000 | 0.4971 | 0.6133 | 0.7587 | 0.7585 | 0.7767 | 0.7577 | 0.7964 | 0.7042 | 0.6236 | 0.6053 | 0.5724 | 0.6519 |
| 0.456 | 33.6842 | 32000 | 0.4880 | 0.6346 | 0.7763 | 0.7764 | 0.7792 | 0.7575 | 0.7675 | 0.8011 | 0.6434 | 0.6257 | 0.6376 | 0.6317 |
| 0.4278 | 37.8947 | 36000 | 0.4139 | 0.6688 | 0.8015 | 0.8014 | 0.8149 | 0.7962 | 0.8101 | 0.7849 | 0.6663 | 0.6599 | 0.6611 | 0.6880 |
| 0.4974 | 42.1053 | 40000 | 0.3921 | 0.6863 | 0.8132 | 0.8132 | 0.7997 | 0.8370 | 0.8165 | 0.7996 | 0.7065 | 0.6548 | 0.6794 | 0.7046 |
| 0.4364 | 46.3158 | 44000 | 0.3697 | 0.7023 | 0.8244 | 0.8247 | 0.7941 | 0.8293 | 0.8225 | 0.8520 | 0.7248 | 0.6884 | 0.6982 | 0.6978 |
| 0.3254 | 50.5263 | 48000 | 0.3521 | 0.7152 | 0.8340 | 0.8338 | 0.8529 | 0.8268 | 0.8315 | 0.8247 | 0.7016 | 0.7133 | 0.7162 | 0.7295 |
| 0.3139 | 54.7368 | 52000 | 0.3471 | 0.7224 | 0.8386 | 0.8386 | 0.8343 | 0.8526 | 0.8305 | 0.8371 | 0.7209 | 0.7032 | 0.7306 | 0.7347 |
| 0.3209 | 58.9474 | 56000 | 0.3253 | 0.7359 | 0.8479 | 0.8479 | 0.8565 | 0.8363 | 0.8525 | 0.8463 | 0.7340 | 0.7348 | 0.7294 | 0.7455 |
| 0.2815 | 63.1579 | 60000 | 0.3234 | 0.7421 | 0.8516 | 0.8516 | 0.8468 | 0.8707 | 0.8431 | 0.8459 | 0.7466 | 0.7148 | 0.7463 | 0.7608 |
| 0.3002 | 67.3684 | 64000 | 0.3132 | 0.7520 | 0.8584 | 0.8584 | 0.8545 | 0.8623 | 0.8503 | 0.8663 | 0.7544 | 0.7411 | 0.7560 | 0.7566 |
| 0.2874 | 71.5789 | 68000 | 0.3068 | 0.7571 | 0.8615 | 0.8615 | 0.8582 | 0.8814 | 0.8540 | 0.8524 | 0.7628 | 0.7325 | 0.7639 | 0.7693 |
| 1.0781 | 75.7895 | 72000 | 0.3185 | 0.7524 | 0.8588 | 0.8586 | 0.8755 | 0.8528 | 0.8649 | 0.8419 | 0.7446 | 0.7461 | 0.7503 | 0.7686 |
| 0.2688 | 80.0 | 76000 | 0.2993 | 0.7663 | 0.8676 | 0.8676 | 0.8688 | 0.8677 | 0.8639 | 0.8702 | 0.7693 | 0.7553 | 0.7674 | 0.7730 |
| 0.2566 | 84.2105 | 80000 | 0.2962 | 0.7696 | 0.8698 | 0.8698 | 0.8669 | 0.8687 | 0.8673 | 0.8761 | 0.7729 | 0.7574 | 0.7739 | 0.7744 |
| 0.2556 | 88.4211 | 84000 | 0.2985 | 0.7754 | 0.8735 | 0.8735 | 0.8735 | 0.8679 | 0.8799 | 0.8726 | 0.7767 | 0.7676 | 0.7734 | 0.7839 |
| 0.2402 | 92.6316 | 88000 | 0.2976 | 0.7786 | 0.8755 | 0.8755 | 0.8786 | 0.8751 | 0.8694 | 0.8790 | 0.7766 | 0.7711 | 0.7810 | 0.7858 |
| 0.286 | 96.8421 | 92000 | 0.2998 | 0.7803 | 0.8766 | 0.8766 | 0.8733 | 0.8750 | 0.8828 | 0.8752 | 0.7833 | 0.7733 | 0.7770 | 0.7876 |
| 0.1853 | 101.0526 | 96000 | 0.2987 | 0.7843 | 0.8791 | 0.8791 | 0.8810 | 0.8816 | 0.8813 | 0.8726 | 0.7811 | 0.7754 | 0.7870 | 0.7935 |
| 0.2401 | 105.2632 | 100000 | 0.3093 | 0.7819 | 0.8776 | 0.8776 | 0.8761 | 0.8820 | 0.8791 | 0.8732 | 0.7818 | 0.7701 | 0.7864 | 0.7893 |
| 0.2546 | 109.4737 | 104000 | 0.3095 | 0.7846 | 0.8793 | 0.8793 | 0.8819 | 0.8829 | 0.8756 | 0.8766 | 0.7850 | 0.7731 | 0.7891 | 0.7912 |
| 0.3595 | 113.6842 | 108000 | 0.3096 | 0.7857 | 0.8800 | 0.8800 | 0.8777 | 0.8820 | 0.8795 | 0.8806 | 0.7858 | 0.7766 | 0.7892 | 0.7912 |
| 0.2484 | 117.8947 | 112000 | 0.3128 | 0.7857 | 0.8800 | 0.8799 | 0.8782 | 0.8842 | 0.8809 | 0.8765 | 0.7869 | 0.7762 | 0.7880 | 0.7916 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"0-0",
"0-90",
"90-0",
"90-90"
] |
jenniferlumeng/sagittal-b0-finetuned-segments |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sagittal-b0-finetuned-segments
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jenniferlumeng/Sagittal dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0561
- Mean Iou: 0.3928
- Mean Accuracy: 0.5550
- Overall Accuracy: 0.6001
- Accuracy Background: nan
- Accuracy Olfactory bulb: 0.7548
- Accuracy Anterior olfactory nucleus: 0.2361
- Accuracy Basal ganglia: 0.5670
- Accuracy Cortex: 0.8443
- Accuracy Hypothalamus: 0.3500
- Accuracy Thalamus: 0.3216
- Accuracy Hippocampus: 0.4568
- Accuracy Midbrain: 0.7339
- Accuracy Cerebellum: 0.8112
- Accuracy Pons and medulla: 0.4748
- Iou Background: 0.0
- Iou Olfactory bulb: 0.5861
- Iou Anterior olfactory nucleus: 0.2110
- Iou Basal ganglia: 0.4574
- Iou Cortex: 0.6560
- Iou Hypothalamus: 0.3196
- Iou Thalamus: 0.3020
- Iou Hippocampus: 0.4364
- Iou Midbrain: 0.2970
- Iou Cerebellum: 0.6248
- Iou Pons and medulla: 0.4303
## Model description
More information needed
## Intended uses & limitations
More information needed
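In the meantime, a minimal inference sketch (assumptions: the checkpoint and its image-processor config are loadable from the Hub under this model id, and the input file name is hypothetical):

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "jenniferlumeng/sagittal-b0-finetuned-segments"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("sagittal_slice.png").convert("RGB")  # hypothetical input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) integer label ids
```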
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Olfactory bulb | Accuracy Anterior olfactory nucleus | Accuracy Basal ganglia | Accuracy Cortex | Accuracy Hypothalamus | Accuracy Thalamus | Accuracy Hippocampus | Accuracy Midbrain | Accuracy Cerebellum | Accuracy Pons and medulla | Iou Background | Iou Olfactory bulb | Iou Anterior olfactory nucleus | Iou Basal ganglia | Iou Cortex | Iou Hypothalamus | Iou Thalamus | Iou Hippocampus | Iou Midbrain | Iou Cerebellum | Iou Pons and medulla |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------------:|:-----------------------------------:|:----------------------:|:---------------:|:---------------------:|:-----------------:|:--------------------:|:-----------------:|:-------------------:|:-------------------------:|:--------------:|:------------------:|:------------------------------:|:-----------------:|:----------:|:----------------:|:------------:|:---------------:|:------------:|:--------------:|:--------------------:|
| 1.9251 | 3.3333 | 20 | 2.3067 | 0.0795 | 0.1796 | 0.2634 | nan | 0.2399 | 0.0 | 0.0 | 0.9926 | 0.0554 | 0.0 | 0.0957 | 0.0 | 0.3004 | 0.1122 | 0.0 | 0.1546 | 0.0 | 0.0 | 0.2430 | 0.0548 | 0.0 | 0.0692 | 0.0 | 0.2769 | 0.0760 |
| 1.5773 | 6.6667 | 40 | 1.8563 | 0.1695 | 0.2863 | 0.3584 | nan | 0.4130 | 0.0 | 0.0340 | 0.8260 | 0.2004 | 0.1704 | 0.1409 | 0.4293 | 0.4427 | 0.2064 | 0.0 | 0.2795 | 0.0 | 0.0326 | 0.5150 | 0.1849 | 0.1148 | 0.1249 | 0.1197 | 0.3308 | 0.1619 |
| 1.3176 | 10.0 | 60 | 1.6904 | 0.2571 | 0.4066 | 0.4408 | nan | 0.6887 | 0.0 | 0.8044 | 0.7641 | 0.2494 | 0.1046 | 0.3613 | 0.2678 | 0.5775 | 0.2483 | 0.0 | 0.4874 | 0.0 | 0.1599 | 0.6414 | 0.2309 | 0.0979 | 0.3281 | 0.2014 | 0.4568 | 0.2239 |
| 1.3147 | 13.3333 | 80 | 1.4640 | 0.2817 | 0.4272 | 0.4773 | nan | 0.5533 | 0.0 | 0.4670 | 0.7442 | 0.2610 | 0.0613 | 0.4597 | 0.6721 | 0.6021 | 0.4509 | 0.0 | 0.3820 | 0.0 | 0.3403 | 0.6051 | 0.2377 | 0.0604 | 0.4191 | 0.2155 | 0.4920 | 0.3467 |
| 1.1284 | 16.6667 | 100 | 1.3582 | 0.2754 | 0.4165 | 0.4699 | nan | 0.6502 | 0.0036 | 0.5064 | 0.8678 | 0.2266 | 0.1352 | 0.4677 | 0.2578 | 0.5931 | 0.4562 | 0.0 | 0.5197 | 0.0036 | 0.3552 | 0.3330 | 0.2138 | 0.1324 | 0.4201 | 0.2191 | 0.4907 | 0.3414 |
| 1.1223 | 20.0 | 120 | 1.2891 | 0.2862 | 0.4221 | 0.4689 | nan | 0.6084 | 0.1211 | 0.3576 | 0.8239 | 0.2694 | 0.1435 | 0.4589 | 0.4033 | 0.5980 | 0.4369 | 0.0 | 0.4424 | 0.1137 | 0.3147 | 0.3247 | 0.2529 | 0.1306 | 0.4291 | 0.3035 | 0.4722 | 0.3644 |
| 1.0746 | 23.3333 | 140 | 1.2653 | 0.3122 | 0.4564 | 0.5064 | nan | 0.5209 | 0.0778 | 0.5140 | 0.7894 | 0.3297 | 0.1814 | 0.4610 | 0.7193 | 0.5320 | 0.4385 | 0.0 | 0.4051 | 0.0749 | 0.4002 | 0.5456 | 0.2988 | 0.1715 | 0.4247 | 0.2543 | 0.4697 | 0.3895 |
| 1.1374 | 26.6667 | 160 | 1.2168 | 0.3412 | 0.5006 | 0.5445 | nan | 0.7152 | 0.1454 | 0.5457 | 0.8318 | 0.3455 | 0.2429 | 0.4675 | 0.6257 | 0.6441 | 0.4424 | 0.0 | 0.5520 | 0.1348 | 0.3895 | 0.5159 | 0.3081 | 0.2310 | 0.4254 | 0.2683 | 0.5242 | 0.4046 |
| 0.8899 | 30.0 | 180 | 1.1327 | 0.3517 | 0.4974 | 0.5497 | nan | 0.5466 | 0.1382 | 0.4970 | 0.7591 | 0.3350 | 0.2808 | 0.4335 | 0.7737 | 0.7053 | 0.5047 | 0.0 | 0.4196 | 0.1286 | 0.4157 | 0.6405 | 0.3087 | 0.2560 | 0.4186 | 0.2777 | 0.5777 | 0.4251 |
| 0.822 | 33.3333 | 200 | 1.1223 | 0.3765 | 0.5371 | 0.5878 | nan | 0.7811 | 0.1923 | 0.5470 | 0.8318 | 0.3197 | 0.2845 | 0.4424 | 0.7182 | 0.6918 | 0.5623 | 0.0 | 0.5922 | 0.1736 | 0.4012 | 0.6352 | 0.3021 | 0.2676 | 0.4264 | 0.2960 | 0.5843 | 0.4632 |
| 1.3568 | 36.6667 | 220 | 1.0941 | 0.4136 | 0.5818 | 0.6319 | nan | 0.8136 | 0.2270 | 0.5548 | 0.8444 | 0.3559 | 0.7737 | 0.4477 | 0.4848 | 0.7838 | 0.5318 | 0.0 | 0.6210 | 0.2027 | 0.4504 | 0.6852 | 0.3245 | 0.4370 | 0.4324 | 0.2808 | 0.6397 | 0.4756 |
| 0.9664 | 40.0 | 240 | 1.0685 | 0.3843 | 0.5490 | 0.5962 | nan | 0.6310 | 0.2199 | 0.5723 | 0.8041 | 0.3211 | 0.8415 | 0.4591 | 0.3971 | 0.7908 | 0.4532 | 0.0 | 0.4706 | 0.1978 | 0.4563 | 0.6778 | 0.3002 | 0.4135 | 0.4364 | 0.2563 | 0.6112 | 0.4073 |
| 0.824 | 43.3333 | 260 | 1.0399 | 0.3955 | 0.5580 | 0.6051 | nan | 0.7486 | 0.2461 | 0.5592 | 0.8658 | 0.3265 | 0.5614 | 0.4605 | 0.5438 | 0.8004 | 0.4678 | 0.0 | 0.5838 | 0.2191 | 0.4581 | 0.6516 | 0.3069 | 0.3773 | 0.4371 | 0.2778 | 0.6095 | 0.4297 |
| 1.126 | 46.6667 | 280 | 1.0414 | 0.4036 | 0.5741 | 0.6232 | nan | 0.7521 | 0.2361 | 0.5644 | 0.8473 | 0.3231 | 0.8344 | 0.4528 | 0.4500 | 0.8079 | 0.4731 | 0.0 | 0.5863 | 0.2112 | 0.4592 | 0.6156 | 0.3047 | 0.4545 | 0.4341 | 0.3238 | 0.6222 | 0.4280 |
| 0.8935 | 50.0 | 300 | 1.0561 | 0.3928 | 0.5550 | 0.6001 | nan | 0.7548 | 0.2361 | 0.5670 | 0.8443 | 0.3500 | 0.3216 | 0.4568 | 0.7339 | 0.8112 | 0.4748 | 0.0 | 0.5861 | 0.2110 | 0.4574 | 0.6560 | 0.3196 | 0.3020 | 0.4364 | 0.2970 | 0.6248 | 0.4303 |
### Framework versions
- Transformers 4.52.2
- Pytorch 2.6.0+cu124
- Datasets 2.16.1
- Tokenizers 0.21.1
| [
"background",
"olfactory bulb",
"anterior olfactory nucleus",
"basal ganglia",
"cortex",
"hypothalamus",
"thalamus",
"hippocampus",
"midbrain",
"cerebellum",
"pons and medulla"
] |
jenniferlumeng/sagittal-b4-finetuned-segments |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sagittal-b4-finetuned-segments
This model is a fine-tuned version of [nvidia/mit-b4](https://huggingface.co/nvidia/mit-b4) on the jenniferlumeng/Sagittal dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5610
- Mean Iou: 0.6387
- Mean Accuracy: 0.7597
- Overall Accuracy: 0.7684
- Accuracy Background: nan
- Accuracy Olfactory bulb: 0.7170
- Accuracy Anterior olfactory nucleus: 0.6456
- Accuracy Basal ganglia: 0.7788
- Accuracy Cortex: 0.7965
- Accuracy Hypothalamus: 0.6187
- Accuracy Thalamus: 0.7553
- Accuracy Hippocampus: 0.8524
- Accuracy Midbrain: 0.8602
- Accuracy Cerebellum: 0.7899
- Accuracy Pons and medulla: 0.7831
- Iou Background: 0.0
- Iou Olfactory bulb: 0.6979
- Iou Anterior olfactory nucleus: 0.5897
- Iou Basal ganglia: 0.7036
- Iou Cortex: 0.7569
- Iou Hypothalamus: 0.5348
- Iou Thalamus: 0.7058
- Iou Hippocampus: 0.8192
- Iou Midbrain: 0.7187
- Iou Cerebellum: 0.7689
- Iou Pons and medulla: 0.7295
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
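For reference, attaching a segmentation head with this card's eleven classes to the mit-b4 backbone typically looks like the sketch below (the label order is taken from this card's class list; `ignore_mismatched_sizes` is an assumption about how the decode head was re-initialized):

```python
from transformers import SegformerForSemanticSegmentation

labels = [
    "background", "olfactory bulb", "anterior olfactory nucleus",
    "basal ganglia", "cortex", "hypothalamus", "thalamus",
    "hippocampus", "midbrain", "cerebellum", "pons and medulla",
]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

# Start from the pretrained encoder and attach a fresh 11-class decode head.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b4",
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,  # assumption: the head is trained from scratch
)
```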
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Olfactory bulb | Accuracy Anterior olfactory nucleus | Accuracy Basal ganglia | Accuracy Cortex | Accuracy Hypothalamus | Accuracy Thalamus | Accuracy Hippocampus | Accuracy Midbrain | Accuracy Cerebellum | Accuracy Pons and medulla | Iou Background | Iou Olfactory bulb | Iou Anterior olfactory nucleus | Iou Basal ganglia | Iou Cortex | Iou Hypothalamus | Iou Thalamus | Iou Hippocampus | Iou Midbrain | Iou Cerebellum | Iou Pons and medulla |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------------:|:-----------------------------------:|:----------------------:|:---------------:|:---------------------:|:-----------------:|:--------------------:|:-----------------:|:-------------------:|:-------------------------:|:--------------:|:------------------:|:------------------------------:|:-----------------:|:----------:|:----------------:|:------------:|:---------------:|:------------:|:--------------:|:--------------------:|
| 1.1622 | 3.3333 | 20 | 1.5275 | 0.2290 | 0.2936 | 0.3032 | nan | 0.3546 | 0.0661 | 0.3629 | 0.4619 | 0.3534 | 0.0245 | 0.4997 | 0.3121 | 0.3472 | 0.1532 | 0.0 | 0.3179 | 0.0654 | 0.2793 | 0.3896 | 0.3249 | 0.0232 | 0.4386 | 0.1852 | 0.3458 | 0.1488 |
| 0.8211 | 6.6667 | 40 | 1.0160 | 0.3284 | 0.4622 | 0.4886 | nan | 0.4025 | 0.2457 | 0.4573 | 0.7211 | 0.3213 | 0.7259 | 0.5201 | 0.4266 | 0.3944 | 0.4069 | 0.0 | 0.3734 | 0.2422 | 0.3826 | 0.4803 | 0.2946 | 0.2820 | 0.4623 | 0.3025 | 0.3941 | 0.3986 |
| 0.2823 | 10.0 | 60 | 0.9503 | 0.4263 | 0.5468 | 0.5681 | nan | 0.4108 | 0.3905 | 0.5659 | 0.6721 | 0.4120 | 0.7734 | 0.5256 | 0.6114 | 0.6022 | 0.5039 | 0.0 | 0.4030 | 0.3792 | 0.4867 | 0.6002 | 0.3712 | 0.5518 | 0.4860 | 0.4593 | 0.4789 | 0.4733 |
| 0.4346 | 13.3333 | 80 | 0.6683 | 0.5384 | 0.6798 | 0.7221 | nan | 0.5424 | 0.5376 | 0.7562 | 0.8557 | 0.5587 | 0.8000 | 0.5245 | 0.7792 | 0.7022 | 0.7419 | 0.0 | 0.5198 | 0.5055 | 0.6668 | 0.7488 | 0.4901 | 0.6675 | 0.4835 | 0.5702 | 0.6572 | 0.6131 |
| 0.1348 | 16.6667 | 100 | 0.5909 | 0.5275 | 0.6836 | 0.7131 | nan | 0.5024 | 0.5379 | 0.6884 | 0.7820 | 0.6158 | 0.8733 | 0.5253 | 0.7972 | 0.8618 | 0.6525 | 0.0 | 0.4253 | 0.5029 | 0.5920 | 0.7553 | 0.5203 | 0.5693 | 0.4756 | 0.5773 | 0.7332 | 0.6511 |
| 0.1317 | 20.0 | 120 | 0.5279 | 0.6000 | 0.7499 | 0.7699 | nan | 0.6679 | 0.6691 | 0.6804 | 0.9212 | 0.6790 | 0.7882 | 0.7477 | 0.8093 | 0.8195 | 0.7169 | 0.0 | 0.6282 | 0.6283 | 0.6090 | 0.7796 | 0.6010 | 0.5967 | 0.6350 | 0.6556 | 0.7736 | 0.6933 |
| 0.2667 | 23.3333 | 140 | 0.6451 | 0.5482 | 0.6840 | 0.6961 | nan | 0.6738 | 0.5915 | 0.6175 | 0.7717 | 0.6215 | 0.7162 | 0.7077 | 0.7127 | 0.7174 | 0.7097 | 0.0 | 0.6220 | 0.5489 | 0.5643 | 0.7119 | 0.5204 | 0.5866 | 0.6468 | 0.5720 | 0.6582 | 0.5986 |
| 0.3673 | 26.6667 | 160 | 0.5395 | 0.5843 | 0.7265 | 0.7280 | nan | 0.7682 | 0.6859 | 0.6984 | 0.8040 | 0.6214 | 0.7752 | 0.8302 | 0.7929 | 0.5215 | 0.7669 | 0.0 | 0.7397 | 0.6224 | 0.6607 | 0.6138 | 0.5355 | 0.6739 | 0.6855 | 0.6733 | 0.5017 | 0.7208 |
| 0.345 | 30.0 | 180 | 0.4865 | 0.6101 | 0.7534 | 0.7675 | nan | 0.7244 | 0.7111 | 0.7634 | 0.9073 | 0.7027 | 0.7449 | 0.7589 | 0.8557 | 0.6596 | 0.7061 | 0.0 | 0.7089 | 0.6334 | 0.6898 | 0.6957 | 0.5837 | 0.6786 | 0.6832 | 0.7091 | 0.6404 | 0.6886 |
| 0.1892 | 33.3333 | 200 | 0.5088 | 0.6134 | 0.7589 | 0.7739 | nan | 0.6971 | 0.6785 | 0.7077 | 0.8255 | 0.6950 | 0.7285 | 0.8019 | 0.7823 | 0.8302 | 0.8419 | 0.0 | 0.6760 | 0.6139 | 0.6244 | 0.7471 | 0.5948 | 0.6243 | 0.7012 | 0.6364 | 0.7359 | 0.7934 |
| 0.283 | 36.6667 | 220 | 0.5012 | 0.6032 | 0.7387 | 0.7525 | nan | 0.6736 | 0.6548 | 0.6843 | 0.8329 | 0.6138 | 0.7489 | 0.8097 | 0.7708 | 0.8219 | 0.7763 | 0.0 | 0.6511 | 0.5898 | 0.5952 | 0.7460 | 0.5490 | 0.6433 | 0.7184 | 0.6573 | 0.7478 | 0.7373 |
| 0.3255 | 40.0 | 240 | 0.4538 | 0.6439 | 0.7751 | 0.7926 | nan | 0.6323 | 0.6450 | 0.7895 | 0.8253 | 0.6834 | 0.8150 | 0.8167 | 0.8587 | 0.8580 | 0.8274 | 0.0 | 0.6155 | 0.5910 | 0.6879 | 0.7771 | 0.5999 | 0.7198 | 0.7293 | 0.7568 | 0.8085 | 0.7969 |
| 0.148 | 43.3333 | 260 | 0.5867 | 0.5934 | 0.7219 | 0.7242 | nan | 0.5819 | 0.6130 | 0.7968 | 0.7211 | 0.6326 | 0.7201 | 0.8921 | 0.7792 | 0.7586 | 0.7235 | 0.0 | 0.5698 | 0.5527 | 0.7039 | 0.6853 | 0.5568 | 0.6735 | 0.7675 | 0.6665 | 0.6775 | 0.6738 |
| 0.2442 | 46.6667 | 280 | 0.5438 | 0.6123 | 0.7363 | 0.7502 | nan | 0.6327 | 0.6296 | 0.7893 | 0.7839 | 0.5792 | 0.7350 | 0.8244 | 0.8132 | 0.7980 | 0.7780 | 0.0 | 0.6221 | 0.5730 | 0.6917 | 0.7543 | 0.5004 | 0.6962 | 0.7619 | 0.6561 | 0.7618 | 0.7179 |
| 0.1645 | 50.0 | 300 | 0.5079 | 0.6323 | 0.7651 | 0.7711 | nan | 0.7346 | 0.6775 | 0.7749 | 0.7836 | 0.6132 | 0.7336 | 0.8661 | 0.8496 | 0.8220 | 0.7960 | 0.0 | 0.6891 | 0.6033 | 0.7091 | 0.7671 | 0.5359 | 0.6642 | 0.7340 | 0.7250 | 0.7986 | 0.7295 |
| 0.2699 | 53.3333 | 320 | 0.5663 | 0.6069 | 0.7401 | 0.7475 | nan | 0.7376 | 0.6604 | 0.7358 | 0.8071 | 0.6238 | 0.7225 | 0.8290 | 0.8199 | 0.7012 | 0.7635 | 0.0 | 0.7229 | 0.6041 | 0.6395 | 0.7020 | 0.5321 | 0.6502 | 0.7546 | 0.6855 | 0.6720 | 0.7131 |
| 0.2053 | 56.6667 | 340 | 0.5013 | 0.6341 | 0.7684 | 0.7750 | nan | 0.7147 | 0.6551 | 0.7489 | 0.8326 | 0.6458 | 0.8202 | 0.8792 | 0.8516 | 0.7662 | 0.7696 | 0.0 | 0.6918 | 0.5916 | 0.6512 | 0.7922 | 0.5612 | 0.7269 | 0.7414 | 0.7425 | 0.7519 | 0.7245 |
| 0.2427 | 60.0 | 360 | 0.4900 | 0.6275 | 0.7673 | 0.7721 | nan | 0.7584 | 0.7267 | 0.7405 | 0.8320 | 0.6785 | 0.7632 | 0.8677 | 0.8152 | 0.6697 | 0.8215 | 0.0 | 0.7289 | 0.6565 | 0.6647 | 0.7254 | 0.5795 | 0.6798 | 0.7799 | 0.6798 | 0.6329 | 0.7752 |
| 0.0668 | 63.3333 | 380 | 0.4845 | 0.6435 | 0.7722 | 0.7766 | nan | 0.7479 | 0.7064 | 0.7754 | 0.7830 | 0.6316 | 0.7340 | 0.8832 | 0.8429 | 0.7855 | 0.8320 | 0.0 | 0.7189 | 0.6336 | 0.6988 | 0.7412 | 0.5582 | 0.6887 | 0.8092 | 0.7069 | 0.7433 | 0.7797 |
| 0.1278 | 66.6667 | 400 | 0.5318 | 0.6220 | 0.7447 | 0.7560 | nan | 0.7063 | 0.6682 | 0.7959 | 0.7900 | 0.6057 | 0.7272 | 0.8067 | 0.8161 | 0.7418 | 0.7891 | 0.0 | 0.6939 | 0.6056 | 0.7106 | 0.7178 | 0.5353 | 0.7047 | 0.7581 | 0.6757 | 0.7074 | 0.7330 |
| 0.1184 | 70.0 | 420 | 0.5153 | 0.6434 | 0.7695 | 0.7778 | nan | 0.7200 | 0.6898 | 0.7627 | 0.8246 | 0.6589 | 0.7738 | 0.8395 | 0.8716 | 0.7698 | 0.7847 | 0.0 | 0.6858 | 0.6246 | 0.7008 | 0.7713 | 0.5730 | 0.6913 | 0.8064 | 0.7338 | 0.7483 | 0.7425 |
| 0.1317 | 73.3333 | 440 | 0.5403 | 0.6346 | 0.7586 | 0.7668 | nan | 0.7143 | 0.6677 | 0.7672 | 0.7990 | 0.5974 | 0.7354 | 0.8611 | 0.8529 | 0.8030 | 0.7876 | 0.0 | 0.6901 | 0.6051 | 0.6957 | 0.7751 | 0.5214 | 0.6903 | 0.8054 | 0.7020 | 0.7681 | 0.7279 |
| 0.0959 | 76.6667 | 460 | 0.5506 | 0.6325 | 0.7529 | 0.7596 | nan | 0.7081 | 0.6401 | 0.7706 | 0.7878 | 0.6339 | 0.7571 | 0.8553 | 0.8437 | 0.7636 | 0.7686 | 0.0 | 0.6859 | 0.5831 | 0.6937 | 0.7470 | 0.5459 | 0.7144 | 0.8041 | 0.7182 | 0.7359 | 0.7289 |
| 0.1181 | 80.0 | 480 | 0.5810 | 0.6227 | 0.7489 | 0.7528 | nan | 0.7194 | 0.6986 | 0.7478 | 0.7786 | 0.6016 | 0.7453 | 0.8501 | 0.8435 | 0.7140 | 0.7897 | 0.0 | 0.7035 | 0.6368 | 0.6793 | 0.7059 | 0.5201 | 0.6781 | 0.7990 | 0.7002 | 0.6957 | 0.7306 |
| 0.1272 | 83.3333 | 500 | 0.5927 | 0.6213 | 0.7406 | 0.7501 | nan | 0.7056 | 0.6515 | 0.7716 | 0.7891 | 0.6042 | 0.7289 | 0.8221 | 0.8266 | 0.7345 | 0.7725 | 0.0 | 0.6898 | 0.5965 | 0.7000 | 0.7362 | 0.5184 | 0.7017 | 0.7840 | 0.6860 | 0.7035 | 0.7186 |
| 0.1653 | 86.6667 | 520 | 0.5653 | 0.6368 | 0.7586 | 0.7645 | nan | 0.7195 | 0.6718 | 0.7697 | 0.7843 | 0.6201 | 0.7360 | 0.8479 | 0.8640 | 0.8044 | 0.7683 | 0.0 | 0.7011 | 0.6056 | 0.7009 | 0.7658 | 0.5286 | 0.7009 | 0.8013 | 0.7081 | 0.7787 | 0.7136 |
| 0.1633 | 90.0 | 540 | 0.5539 | 0.6421 | 0.7641 | 0.7693 | nan | 0.7257 | 0.6989 | 0.7533 | 0.8062 | 0.6249 | 0.7595 | 0.8532 | 0.8622 | 0.7844 | 0.7728 | 0.0 | 0.7107 | 0.6324 | 0.6835 | 0.7668 | 0.5405 | 0.6966 | 0.8200 | 0.7257 | 0.7640 | 0.7232 |
| 0.0863 | 93.3333 | 560 | 0.5737 | 0.6348 | 0.7544 | 0.7607 | nan | 0.7237 | 0.6606 | 0.7705 | 0.7775 | 0.6186 | 0.7401 | 0.8493 | 0.8438 | 0.7727 | 0.7868 | 0.0 | 0.7032 | 0.5988 | 0.7055 | 0.7366 | 0.5337 | 0.7141 | 0.8136 | 0.7060 | 0.7485 | 0.7230 |
| 0.072 | 96.6667 | 580 | 0.5544 | 0.6400 | 0.7616 | 0.7691 | nan | 0.7153 | 0.6546 | 0.7755 | 0.7964 | 0.6272 | 0.7607 | 0.8605 | 0.8571 | 0.7832 | 0.7854 | 0.0 | 0.6956 | 0.5972 | 0.7029 | 0.7550 | 0.5402 | 0.7088 | 0.8237 | 0.7215 | 0.7640 | 0.7309 |
| 0.1014 | 100.0 | 600 | 0.5610 | 0.6387 | 0.7597 | 0.7684 | nan | 0.7170 | 0.6456 | 0.7788 | 0.7965 | 0.6187 | 0.7553 | 0.8524 | 0.8602 | 0.7899 | 0.7831 | 0.0 | 0.6979 | 0.5897 | 0.7036 | 0.7569 | 0.5348 | 0.7058 | 0.8192 | 0.7187 | 0.7689 | 0.7295 |
### Framework versions
- Transformers 4.52.2
- Pytorch 2.6.0+cu124
- Datasets 2.16.1
- Tokenizers 0.21.1
| [
"background",
"olfactory bulb",
"anterior olfactory nucleus",
"basal ganglia",
"cortex",
"hypothalamus",
"thalamus",
"hippocampus",
"midbrain",
"cerebellum",
"pons and medulla"
] |
fuji12345/segformer-finetuned-sidewalk-10k-steps |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-finetuned-sidewalk-10k-steps
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the fuji12345/sample-segmentation_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1181
- Mean Iou: 0.0
- Mean Accuracy: nan
- Overall Accuracy: nan
- Accuracy Hallucination: nan
- Accuracy Normal: nan
- Iou Hallucination: 0.0
- Iou Normal: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 200
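The polynomial schedule over a fixed 200 steps can be reproduced with the stock helper; a minimal sketch (a warmup of 0 and the default power of 1.0 are assumptions, since the card does not state them):

```python
import torch
from transformers import get_polynomial_decay_schedule_with_warmup

# Dummy parameter so the optimizer has something to schedule.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.AdamW(params, lr=6e-5, betas=(0.9, 0.999), eps=1e-8)

scheduler = get_polynomial_decay_schedule_with_warmup(
    optimizer,
    num_warmup_steps=0,      # assumption: no warmup stated on the card
    num_training_steps=200,  # matches training_steps above
    power=1.0,               # assumption: Trainer's default polynomial power
)
```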
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Hallucination | Accuracy Normal | Iou Hallucination | Iou Normal |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------------:|:---------------:|:-----------------:|:----------:|
| No log | 1.0 | 21 | 0.4468 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| No log | 2.0 | 42 | 0.2196 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| No log | 3.0 | 63 | 0.2136 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| No log | 4.0 | 84 | 0.1816 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 5.0 | 105 | 0.1449 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 6.0 | 126 | 0.1380 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 7.0 | 147 | 0.1239 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 8.0 | 168 | 0.1247 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 9.0 | 189 | 0.1163 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.1596 | 9.5238 | 200 | 0.1181 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"hallucination",
"normal"
] |
jenniferlumeng/sagittal-b4-v11-finetuned-segments |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sagittal-b4-v11-finetuned-segments
This model is a fine-tuned version of [nvidia/mit-b4](https://huggingface.co/nvidia/mit-b4) on the jenniferlumeng/MiceSagittal dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2102
- Mean Iou: 0.7660
- Mean Accuracy: 0.8966
- Overall Accuracy: 0.9025
- Accuracy Background: nan
- Accuracy Olfactory bulb: 0.9311
- Accuracy Anterior olfactory nucleus: 0.8576
- Accuracy Basal ganglia: 0.8881
- Accuracy Cortex: 0.9547
- Accuracy Hypothalamus: 0.7991
- Accuracy Thalamus: 0.8460
- Accuracy Hippocampus: 0.9531
- Accuracy Midbrain: 0.8908
- Accuracy Cerebellum: 0.9482
- Accuracy Pons and medulla: 0.8971
- Iou Background: 0.0
- Iou Olfactory bulb: 0.9039
- Iou Anterior olfactory nucleus: 0.7535
- Iou Basal ganglia: 0.8169
- Iou Cortex: 0.9464
- Iou Hypothalamus: 0.6482
- Iou Thalamus: 0.8106
- Iou Hippocampus: 0.9292
- Iou Midbrain: 0.8164
- Iou Cerebellum: 0.9371
- Iou Pons and medulla: 0.8635
## Model description
More information needed
## Intended uses & limitations
More information needed
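In the meantime, predicted masks can be inspected visually; a small overlay sketch (the color palette is arbitrary and chosen here only for illustration):

```python
import numpy as np
from PIL import Image

def overlay_mask(image: Image.Image, mask: np.ndarray, alpha: float = 0.5) -> Image.Image:
    """Blend an (H, W) integer label mask over an RGB image of the same size."""
    rng = np.random.default_rng(0)
    palette = rng.integers(0, 255, size=(int(mask.max()) + 1, 3), dtype=np.uint8)
    color = Image.fromarray(palette[mask])  # (H, W, 3) colorized mask
    return Image.blend(image.convert("RGB"), color, alpha)

# Usage sketch (hypothetical input): `pred_mask` would be the model's
# upsampled argmax output converted to numpy.
# overlay = overlay_mask(Image.open("slice.png"), pred_mask.numpy())
```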
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Olfactory bulb | Accuracy Anterior olfactory nucleus | Accuracy Basal ganglia | Accuracy Cortex | Accuracy Hypothalamus | Accuracy Thalamus | Accuracy Hippocampus | Accuracy Midbrain | Accuracy Cerebellum | Accuracy Pons and medulla | Iou Background | Iou Olfactory bulb | Iou Anterior olfactory nucleus | Iou Basal ganglia | Iou Cortex | Iou Hypothalamus | Iou Thalamus | Iou Hippocampus | Iou Midbrain | Iou Cerebellum | Iou Pons and medulla |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-----------------------:|:-----------------------------------:|:----------------------:|:---------------:|:---------------------:|:-----------------:|:--------------------:|:-----------------:|:-------------------:|:-------------------------:|:--------------:|:------------------:|:------------------------------:|:-----------------:|:----------:|:----------------:|:------------:|:---------------:|:------------:|:--------------:|:--------------------:|
| 0.679 | 2.8571 | 20 | 0.9745 | 0.6058 | 0.7741 | 0.8090 | nan | 0.8511 | 0.3173 | 0.7203 | 0.9142 | 0.7325 | 0.7090 | 0.9034 | 0.7282 | 0.9504 | 0.9147 | 0.0 | 0.7558 | 0.2904 | 0.5982 | 0.7974 | 0.6685 | 0.5337 | 0.7721 | 0.5572 | 0.9323 | 0.7581 |
| 0.3016 | 5.7143 | 40 | 0.3587 | 0.7273 | 0.8768 | 0.8823 | nan | 0.9822 | 0.8005 | 0.9478 | 0.9440 | 0.6822 | 0.8097 | 0.9763 | 0.7227 | 0.9769 | 0.9253 | 0.0 | 0.9301 | 0.7134 | 0.7860 | 0.9003 | 0.6332 | 0.7492 | 0.8546 | 0.6615 | 0.9086 | 0.8635 |
| 0.1624 | 8.5714 | 60 | 0.2448 | 0.7492 | 0.8943 | 0.8995 | nan | 0.9893 | 0.8487 | 0.8965 | 0.9607 | 0.7695 | 0.7685 | 0.9800 | 0.7855 | 0.9873 | 0.9567 | 0.0 | 0.9430 | 0.6882 | 0.8037 | 0.9349 | 0.6812 | 0.7179 | 0.9229 | 0.7106 | 0.9520 | 0.8865 |
| 0.1345 | 11.4286 | 80 | 0.2230 | 0.7558 | 0.8896 | 0.9019 | nan | 0.9778 | 0.8016 | 0.9463 | 0.9728 | 0.6755 | 0.8240 | 0.9634 | 0.8346 | 0.9680 | 0.9321 | 0.0 | 0.9270 | 0.7009 | 0.8164 | 0.9578 | 0.6296 | 0.7628 | 0.9486 | 0.7469 | 0.9470 | 0.8766 |
| 0.0976 | 14.2857 | 100 | 0.1891 | 0.7781 | 0.9114 | 0.9200 | nan | 0.9805 | 0.8293 | 0.9469 | 0.9676 | 0.7870 | 0.8236 | 0.9832 | 0.8752 | 0.9811 | 0.9393 | 0.0 | 0.9437 | 0.7270 | 0.8437 | 0.9653 | 0.7230 | 0.7625 | 0.9627 | 0.7735 | 0.9634 | 0.8941 |
| 0.0881 | 17.1429 | 120 | 0.1893 | 0.7803 | 0.9135 | 0.9214 | nan | 0.9807 | 0.8364 | 0.9315 | 0.9758 | 0.7977 | 0.8714 | 0.9677 | 0.8514 | 0.9779 | 0.9439 | 0.0 | 0.9431 | 0.7231 | 0.8374 | 0.9699 | 0.7346 | 0.7965 | 0.9561 | 0.7745 | 0.9599 | 0.8885 |
| 0.0873 | 20.0 | 140 | 0.1849 | 0.7739 | 0.9054 | 0.9158 | nan | 0.9813 | 0.8181 | 0.9328 | 0.9729 | 0.7932 | 0.8266 | 0.9580 | 0.8558 | 0.9728 | 0.9421 | 0.0 | 0.9382 | 0.7194 | 0.8417 | 0.9664 | 0.7251 | 0.7666 | 0.9478 | 0.7653 | 0.9597 | 0.8831 |
| 0.076 | 22.8571 | 160 | 0.1871 | 0.7765 | 0.9086 | 0.9192 | nan | 0.9823 | 0.8126 | 0.9226 | 0.9777 | 0.8067 | 0.8463 | 0.9570 | 0.8531 | 0.9776 | 0.9498 | 0.0 | 0.9369 | 0.7113 | 0.8346 | 0.9719 | 0.7353 | 0.7829 | 0.9478 | 0.7706 | 0.9621 | 0.8881 |
| 0.0948 | 25.7143 | 180 | 0.1850 | 0.7727 | 0.9047 | 0.9139 | nan | 0.9730 | 0.8330 | 0.9127 | 0.9667 | 0.8194 | 0.8083 | 0.9538 | 0.8653 | 0.9759 | 0.9384 | 0.0 | 0.9374 | 0.7165 | 0.8343 | 0.9599 | 0.7339 | 0.7588 | 0.9442 | 0.7697 | 0.9559 | 0.8895 |
| 0.0646 | 28.5714 | 200 | 0.1813 | 0.7756 | 0.9067 | 0.9137 | nan | 0.9731 | 0.8354 | 0.9258 | 0.9588 | 0.7750 | 0.8675 | 0.9635 | 0.8512 | 0.9749 | 0.9422 | 0.0 | 0.9414 | 0.7175 | 0.8305 | 0.9569 | 0.7235 | 0.7973 | 0.9530 | 0.7725 | 0.9562 | 0.8827 |
| 0.0636 | 31.4286 | 220 | 0.1894 | 0.7729 | 0.9034 | 0.9130 | nan | 0.9754 | 0.8133 | 0.9279 | 0.9673 | 0.7529 | 0.8737 | 0.9581 | 0.8483 | 0.9725 | 0.9448 | 0.0 | 0.9383 | 0.7088 | 0.8227 | 0.9625 | 0.7090 | 0.7979 | 0.9488 | 0.7750 | 0.9556 | 0.8831 |
| 0.0688 | 34.2857 | 240 | 0.1906 | 0.7761 | 0.9079 | 0.9174 | nan | 0.9803 | 0.8062 | 0.9240 | 0.9747 | 0.7894 | 0.8718 | 0.9650 | 0.8394 | 0.9792 | 0.9491 | 0.0 | 0.9350 | 0.7046 | 0.8302 | 0.9689 | 0.7313 | 0.7982 | 0.9527 | 0.7717 | 0.9579 | 0.8864 |
| 0.0635 | 37.1429 | 260 | 0.1874 | 0.7783 | 0.9093 | 0.9189 | nan | 0.9814 | 0.8162 | 0.9153 | 0.9752 | 0.8063 | 0.8667 | 0.9536 | 0.8567 | 0.9783 | 0.9436 | 0.0 | 0.9397 | 0.7071 | 0.8299 | 0.9683 | 0.7441 | 0.7990 | 0.9457 | 0.7800 | 0.9584 | 0.8887 |
| 0.0693 | 40.0 | 280 | 0.1869 | 0.7785 | 0.9094 | 0.9180 | nan | 0.9779 | 0.8151 | 0.9170 | 0.9727 | 0.8047 | 0.8740 | 0.9628 | 0.8455 | 0.9772 | 0.9471 | 0.0 | 0.9398 | 0.7103 | 0.8302 | 0.9678 | 0.7427 | 0.8030 | 0.9514 | 0.7747 | 0.9586 | 0.8853 |
| 0.0585 | 42.8571 | 300 | 0.1889 | 0.7770 | 0.9081 | 0.9168 | nan | 0.9778 | 0.8248 | 0.9108 | 0.9720 | 0.8021 | 0.8638 | 0.9564 | 0.8545 | 0.9768 | 0.9426 | 0.0 | 0.9389 | 0.7094 | 0.8293 | 0.9663 | 0.7435 | 0.7943 | 0.9470 | 0.7763 | 0.9559 | 0.8859 |
| 0.0783 | 45.7143 | 320 | 0.1880 | 0.7772 | 0.9078 | 0.9166 | nan | 0.9768 | 0.8193 | 0.9141 | 0.9708 | 0.7949 | 0.8700 | 0.9590 | 0.8567 | 0.9741 | 0.9426 | 0.0 | 0.9385 | 0.7093 | 0.8312 | 0.9651 | 0.7414 | 0.7979 | 0.9484 | 0.7769 | 0.9560 | 0.8842 |
| 0.0588 | 48.5714 | 340 | 0.1859 | 0.7774 | 0.9076 | 0.9167 | nan | 0.9767 | 0.8169 | 0.9159 | 0.9709 | 0.8001 | 0.8695 | 0.9546 | 0.8556 | 0.9744 | 0.9415 | 0.0 | 0.9380 | 0.7104 | 0.8323 | 0.9656 | 0.7425 | 0.7984 | 0.9455 | 0.7779 | 0.9563 | 0.8848 |
### Framework versions
- Transformers 4.52.2
- Pytorch 2.6.0+cu124
- Datasets 2.16.1
- Tokenizers 0.21.1
| [
"background",
"olfactory bulb",
"anterior olfactory nucleus",
"basal ganglia",
"cortex",
"hypothalamus",
"thalamus",
"hippocampus",
"midbrain",
"cerebellum",
"pons and medulla"
] |
fuji12345/segformer-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-finetuned
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the fuji12345/t2i-seg dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0002
- Mean Iou: 0.0
- Mean Accuracy: nan
- Overall Accuracy: nan
- Accuracy Hallucination: nan
- Accuracy Normal: nan
- Iou Hallucination: 0.0
- Iou Normal: nan
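The zero and NaN IoU values above usually mean that one class never appears in, or is never predicted for, the evaluation masks. A quick sanity check on the label distribution, sketched under the assumption that the masks are stored as single-channel integer PNGs:

```python
import numpy as np
from PIL import Image

def class_pixel_counts(mask_paths, num_labels=2):
    """Count how many pixels of each class id appear across a set of masks."""
    counts = np.zeros(num_labels, dtype=np.int64)
    for path in mask_paths:
        mask = np.asarray(Image.open(path))
        counts += np.bincount(mask.ravel(), minlength=num_labels)[:num_labels]
    return counts  # per this card's label list: index 0 "hallucination", 1 "normal"
```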
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- num_epochs: 30.0
### Training results
| Training Loss | Epoch | Step | Accuracy Hallucination | Accuracy Normal | Iou Hallucination | Iou Normal | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
|:-------------:|:------:|:----:|:----------------------:|:---------------:|:-----------------:|:----------:|:---------------:|:-------------:|:--------:|:----------------:|
| No log | 1.0 | 21 | nan | nan | 0.0 | 0.0 | 0.4468 | nan | 0.0 | nan |
| No log | 2.0 | 42 | nan | nan | 0.0 | 0.0 | 0.2196 | nan | 0.0 | nan |
| No log | 3.0 | 63 | nan | nan | 0.0 | 0.0 | 0.2136 | nan | 0.0 | nan |
| No log | 4.0 | 84 | nan | nan | 0.0 | 0.0 | 0.1816 | nan | 0.0 | nan |
| 0.3217 | 5.0 | 105 | nan | nan | 0.0 | 0.0 | 0.1449 | nan | 0.0 | nan |
| 0.3217 | 6.0 | 126 | nan | nan | 0.0 | 0.0 | 0.1380 | nan | 0.0 | nan |
| 0.3217 | 7.0 | 147 | nan | nan | 0.0 | 0.0 | 0.1239 | nan | 0.0 | nan |
| 0.3217 | 8.0 | 168 | nan | nan | 0.0 | 0.0 | 0.1247 | nan | 0.0 | nan |
| 0.3217 | 9.0 | 189 | nan | nan | 0.0 | 0.0 | 0.1163 | nan | 0.0 | nan |
| 0.1596 | 9.5238 | 200 | nan | nan | 0.0 | 0.0 | 0.1181 | nan | 0.0 | nan |
| 0.0372 | 2.0 | 400 | nan | nan | 0.0 | nan | 0.0207 | nan | 0.0 | nan |
| 0.014 | 3.0 | 600 | nan | nan | 0.0 | nan | 0.0095 | nan | 0.0 | nan |
| 0.0078 | 4.0 | 800 | nan | nan | 0.0 | nan | 0.0060 | nan | 0.0 | nan |
| 0.0052 | 5.0 | 1000 | nan | nan | 0.0 | nan | 0.0039 | nan | 0.0 | nan |
| 0.0036 | 6.0 | 1200 | nan | nan | 0.0 | nan | 0.0029 | nan | 0.0 | nan |
| 0.0028 | 7.0 | 1400 | nan | nan | 0.0 | nan | 0.0022 | nan | 0.0 | nan |
| 0.0022 | 8.0 | 1600 | nan | nan | 0.0 | nan | 0.0018 | nan | 0.0 | nan |
| 0.0017 | 9.0 | 1800 | nan | nan | 0.0 | nan | 0.0015 | nan | 0.0 | nan |
| 0.0014 | 10.0 | 2000 | nan | nan | 0.0 | nan | 0.0012 | nan | 0.0 | nan |
| 0.0012 | 11.0 | 2200 | nan | nan | 0.0 | nan | 0.0010 | nan | 0.0 | nan |
| 0.0011 | 12.0 | 2400 | nan | nan | 0.0 | nan | 0.0009 | nan | 0.0 | nan |
| 0.0009 | 13.0 | 2600 | nan | nan | 0.0 | nan | 0.0008 | nan | 0.0 | nan |
| 0.0008 | 14.0 | 2800 | nan | nan | 0.0 | nan | 0.0007 | nan | 0.0 | nan |
| 0.0007 | 15.0 | 3000 | nan | nan | 0.0 | nan | 0.0006 | nan | 0.0 | nan |
| 0.0006 | 16.0 | 3200 | nan | nan | 0.0 | nan | 0.0005 | nan | 0.0 | nan |
| 0.0006 | 17.0 | 3400 | nan | nan | 0.0 | nan | 0.0005 | nan | 0.0 | nan |
| 0.0005 | 18.0 | 3600 | nan | nan | 0.0 | nan | 0.0004 | nan | 0.0 | nan |
| 0.0005 | 19.0 | 3800 | nan | nan | 0.0 | nan | 0.0004 | nan | 0.0 | nan |
| 0.0004 | 20.0 | 4000 | nan | nan | 0.0 | nan | 0.0004 | nan | 0.0 | nan |
| 0.0004 | 21.0 | 4200 | nan | nan | 0.0 | nan | 0.0003 | nan | 0.0 | nan |
| 0.0004 | 22.0 | 4400 | nan | nan | 0.0 | nan | 0.0003 | nan | 0.0 | nan |
| 0.0003 | 23.0 | 4600 | nan | nan | 0.0 | nan | 0.0003 | nan | 0.0 | nan |
| 0.0003 | 24.0 | 4800 | nan | nan | 0.0 | nan | 0.0003 | nan | 0.0 | nan |
| 0.0003 | 25.0 | 5000 | nan | nan | 0.0 | nan | 0.0003 | nan | 0.0 | nan |
| 0.0003 | 26.0 | 5200 | nan | nan | 0.0 | nan | 0.0002 | nan | 0.0 | nan |
| 0.0003 | 27.0 | 5400 | nan | nan | 0.0 | nan | 0.0002 | nan | 0.0 | nan |
| 0.0003 | 28.0 | 5600 | nan | nan | 0.0 | nan | 0.0002 | nan | 0.0 | nan |
| 0.0003 | 29.0 | 5800 | nan | nan | 0.0 | nan | 0.0002 | nan | 0.0 | nan |
| 0.0003 | 30.0 | 6000 | nan | nan | 0.0 | nan | 0.0002 | nan | 0.0 | nan |
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"hallucination",
"normal"
] |
fuji12345/segformer-finetuned-e300 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-finetuned-e300
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the fuji12345/t2i-seg-01 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4916
- Mean Iou: 0.1667
- Mean Accuracy: 0.3335
- Overall Accuracy: 0.3335
- Accuracy Normal: nan
- Accuracy Hallucination: 0.3335
- Iou Normal: 0.0
- Iou Hallucination: 0.3335
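With only two classes, the predicted map reduces to a binary hallucination mask; a small post-processing sketch (the class id is an assumption taken from the sibling cards' label order, so it should be checked against this checkpoint's `id2label`):

```python
import torch

def hallucination_mask(logits: torch.Tensor, hallucination_id: int = 0) -> torch.Tensor:
    """Collapse (1, num_labels, H, W) logits into a boolean per-pixel mask."""
    pred = logits.argmax(dim=1)[0]   # (H, W) class ids
    return pred == hallucination_id  # True where a hallucination is predicted

# Usage sketch: `logits` would come from SegformerForSemanticSegmentation(**inputs).logits,
# upsampled to the input resolution before the argmax.
```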
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- num_epochs: 300.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Normal | Accuracy Hallucination | Iou Normal | Iou Hallucination |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:---------------:|:----------------------:|:----------:|:-----------------:|
| No log | 1.0 | 21 | 0.4468 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| No log | 2.0 | 42 | 0.2196 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| No log | 3.0 | 63 | 0.2136 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| No log | 4.0 | 84 | 0.1816 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 5.0 | 105 | 0.1449 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 6.0 | 126 | 0.1380 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 7.0 | 147 | 0.1239 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 8.0 | 168 | 0.1247 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.3217 | 9.0 | 189 | 0.1163 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.1596 | 9.5238 | 200 | 0.1181 | 0.0 | nan | nan | nan | nan | 0.0 | 0.0 |
| 0.0372 | 2.0 | 400 | 0.0207 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.014 | 3.0 | 600 | 0.0095 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0078 | 4.0 | 800 | 0.0060 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0052 | 5.0 | 1000 | 0.0039 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0036 | 6.0 | 1200 | 0.0029 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0028 | 7.0 | 1400 | 0.0022 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0022 | 8.0 | 1600 | 0.0018 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0017 | 9.0 | 1800 | 0.0015 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0014 | 10.0 | 2000 | 0.0012 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0012 | 11.0 | 2200 | 0.0010 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0011 | 12.0 | 2400 | 0.0009 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0009 | 13.0 | 2600 | 0.0008 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0008 | 14.0 | 2800 | 0.0007 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0007 | 15.0 | 3000 | 0.0006 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0006 | 16.0 | 3200 | 0.0005 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0006 | 17.0 | 3400 | 0.0005 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0005 | 18.0 | 3600 | 0.0004 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0005 | 19.0 | 3800 | 0.0004 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0004 | 20.0 | 4000 | 0.0004 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0004 | 21.0 | 4200 | 0.0003 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0004 | 22.0 | 4400 | 0.0003 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 23.0 | 4600 | 0.0003 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 24.0 | 4800 | 0.0003 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 25.0 | 5000 | 0.0003 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 26.0 | 5200 | 0.0002 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 27.0 | 5400 | 0.0002 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 28.0 | 5600 | 0.0002 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 29.0 | 5800 | 0.0002 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.0003 | 30.0 | 6000 | 0.0002 | 0.0 | nan | nan | nan | nan | nan | 0.0 |
| 0.3863 | 31.0 | 6200 | 0.3509 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.3663 | 32.0 | 6400 | 0.3449 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.3563 | 33.0 | 6600 | 0.3311 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.3449 | 34.0 | 6800 | 0.3255 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 |
| 0.3436 | 35.0 | 7000 | 0.3307 | 0.0051 | 0.0103 | 0.0103 | nan | 0.0103 | 0.0 | 0.0103 |
| 0.3563 | 36.0 | 7200 | 0.3274 | 0.0109 | 0.0218 | 0.0218 | nan | 0.0218 | 0.0 | 0.0218 |
| 0.3238 | 37.0 | 7400 | 0.3198 | 0.0790 | 0.1581 | 0.1581 | nan | 0.1581 | 0.0 | 0.1581 |
| 0.3264 | 38.0 | 7600 | 0.3559 | 0.0124 | 0.0247 | 0.0247 | nan | 0.0247 | 0.0 | 0.0247 |
| 0.2876 | 39.0 | 7800 | 0.3287 | 0.0381 | 0.0763 | 0.0763 | nan | 0.0763 | 0.0 | 0.0763 |
| 0.3015 | 40.0 | 8000 | 0.3155 | 0.0277 | 0.0554 | 0.0554 | nan | 0.0554 | 0.0 | 0.0554 |
| 0.3091 | 41.0 | 8200 | 0.3529 | 0.0132 | 0.0263 | 0.0263 | nan | 0.0263 | 0.0 | 0.0263 |
| 0.2939 | 42.0 | 8400 | 0.3105 | 0.1305 | 0.2609 | 0.2609 | nan | 0.2609 | 0.0 | 0.2609 |
| 0.2852 | 43.0 | 8600 | 0.3335 | 0.0190 | 0.0380 | 0.0380 | nan | 0.0380 | 0.0 | 0.0380 |
| 0.2775 | 44.0 | 8800 | 0.3325 | 0.0381 | 0.0762 | 0.0762 | nan | 0.0762 | 0.0 | 0.0762 |
| 0.273 | 45.0 | 9000 | 0.3285 | 0.0835 | 0.1669 | 0.1669 | nan | 0.1669 | 0.0 | 0.1669 |
| 0.271 | 46.0 | 9200 | 0.3186 | 0.0519 | 0.1039 | 0.1039 | nan | 0.1039 | 0.0 | 0.1039 |
| 0.3177 | 47.0 | 9400 | 0.3264 | 0.0378 | 0.0756 | 0.0756 | nan | 0.0756 | 0.0 | 0.0756 |
| 0.2561 | 48.0 | 9600 | 0.2901 | 0.0987 | 0.1974 | 0.1974 | nan | 0.1974 | 0.0 | 0.1974 |
| 0.2707 | 49.0 | 9800 | 0.3176 | 0.0903 | 0.1807 | 0.1807 | nan | 0.1807 | 0.0 | 0.1807 |
| 0.2658 | 50.0 | 10000 | 0.3148 | 0.0465 | 0.0930 | 0.0930 | nan | 0.0930 | 0.0 | 0.0930 |
| 0.2433 | 51.0 | 10200 | 0.3042 | 0.0735 | 0.1470 | 0.1470 | nan | 0.1470 | 0.0 | 0.1470 |
| 0.2489 | 52.0 | 10400 | 0.2871 | 0.1168 | 0.2336 | 0.2336 | nan | 0.2336 | 0.0 | 0.2336 |
| 0.2542 | 53.0 | 10600 | 0.2724 | 0.1409 | 0.2819 | 0.2819 | nan | 0.2819 | 0.0 | 0.2819 |
| 0.2683 | 54.0 | 10800 | 0.2767 | 0.1094 | 0.2189 | 0.2189 | nan | 0.2189 | 0.0 | 0.2189 |
| 0.2473 | 55.0 | 11000 | 0.2957 | 0.1457 | 0.2915 | 0.2915 | nan | 0.2915 | 0.0 | 0.2915 |
| 0.2296 | 56.0 | 11200 | 0.3183 | 0.0671 | 0.1343 | 0.1343 | nan | 0.1343 | 0.0 | 0.1343 |
| 0.2359 | 57.0 | 11400 | 0.3185 | 0.0603 | 0.1206 | 0.1206 | nan | 0.1206 | 0.0 | 0.1206 |
| 0.2609 | 58.0 | 11600 | 0.3024 | 0.1323 | 0.2645 | 0.2645 | nan | 0.2645 | 0.0 | 0.2645 |
| 0.2577 | 59.0 | 11800 | 0.3330 | 0.0746 | 0.1492 | 0.1492 | nan | 0.1492 | 0.0 | 0.1492 |
| 0.2296 | 60.0 | 12000 | 0.3051 | 0.1009 | 0.2017 | 0.2017 | nan | 0.2017 | 0.0 | 0.2017 |
| 0.2389 | 61.0 | 12200 | 0.3493 | 0.0740 | 0.1479 | 0.1479 | nan | 0.1479 | 0.0 | 0.1479 |
| 0.2307 | 62.0 | 12400 | 0.2955 | 0.2280 | 0.4560 | 0.4560 | nan | 0.4560 | 0.0 | 0.4560 |
| 0.2364 | 63.0 | 12600 | 0.3018 | 0.1304 | 0.2607 | 0.2607 | nan | 0.2607 | 0.0 | 0.2607 |
| 0.2362 | 64.0 | 12800 | 0.3658 | 0.0673 | 0.1346 | 0.1346 | nan | 0.1346 | 0.0 | 0.1346 |
| 0.2198 | 65.0 | 13000 | 0.3355 | 0.0798 | 0.1595 | 0.1595 | nan | 0.1595 | 0.0 | 0.1595 |
| 0.2368 | 66.0 | 13200 | 0.2951 | 0.1379 | 0.2758 | 0.2758 | nan | 0.2758 | 0.0 | 0.2758 |
| 0.2094 | 67.0 | 13400 | 0.3200 | 0.0869 | 0.1739 | 0.1739 | nan | 0.1739 | 0.0 | 0.1739 |
| 0.2189 | 68.0 | 13600 | 0.3350 | 0.0969 | 0.1938 | 0.1938 | nan | 0.1938 | 0.0 | 0.1938 |
| 0.206 | 69.0 | 13800 | 0.3307 | 0.1245 | 0.2489 | 0.2489 | nan | 0.2489 | 0.0 | 0.2489 |
| 0.2194 | 70.0 | 14000 | 0.3538 | 0.0615 | 0.1231 | 0.1231 | nan | 0.1231 | 0.0 | 0.1231 |
| 0.2043 | 71.0 | 14200 | 0.3206 | 0.0994 | 0.1988 | 0.1988 | nan | 0.1988 | 0.0 | 0.1988 |
| 0.2004 | 72.0 | 14400 | 0.3028 | 0.1384 | 0.2768 | 0.2768 | nan | 0.2768 | 0.0 | 0.2768 |
| 0.2097 | 73.0 | 14600 | 0.3521 | 0.1193 | 0.2386 | 0.2386 | nan | 0.2386 | 0.0 | 0.2386 |
| 0.1835 | 74.0 | 14800 | 0.3546 | 0.1214 | 0.2428 | 0.2428 | nan | 0.2428 | 0.0 | 0.2428 |
| 0.205 | 75.0 | 15000 | 0.3291 | 0.0999 | 0.1999 | 0.1999 | nan | 0.1999 | 0.0 | 0.1999 |
| 0.1994 | 76.0 | 15200 | 0.3340 | 0.0769 | 0.1537 | 0.1537 | nan | 0.1537 | 0.0 | 0.1537 |
| 0.1973 | 77.0 | 15400 | 0.3124 | 0.1802 | 0.3604 | 0.3604 | nan | 0.3604 | 0.0 | 0.3604 |
| 0.1942 | 78.0 | 15600 | 0.3522 | 0.0889 | 0.1777 | 0.1777 | nan | 0.1777 | 0.0 | 0.1777 |
| 0.1753 | 79.0 | 15800 | 0.3308 | 0.1629 | 0.3257 | 0.3257 | nan | 0.3257 | 0.0 | 0.3257 |
| 0.1862 | 80.0 | 16000 | 0.3474 | 0.1013 | 0.2026 | 0.2026 | nan | 0.2026 | 0.0 | 0.2026 |
| 0.2032 | 81.0 | 16200 | 0.3042 | 0.1799 | 0.3599 | 0.3599 | nan | 0.3599 | 0.0 | 0.3599 |
| 0.1746 | 82.0 | 16400 | 0.3452 | 0.1168 | 0.2336 | 0.2336 | nan | 0.2336 | 0.0 | 0.2336 |
| 0.2017 | 83.0 | 16600 | 0.3392 | 0.1316 | 0.2632 | 0.2632 | nan | 0.2632 | 0.0 | 0.2632 |
| 0.1723 | 84.0 | 16800 | 0.3105 | 0.2185 | 0.4370 | 0.4370 | nan | 0.4370 | 0.0 | 0.4370 |
| 0.1756 | 85.0 | 17000 | 0.3224 | 0.1250 | 0.2501 | 0.2501 | nan | 0.2501 | 0.0 | 0.2501 |
| 0.165 | 86.0 | 17200 | 0.3527 | 0.1024 | 0.2047 | 0.2047 | nan | 0.2047 | 0.0 | 0.2047 |
| 0.1642 | 87.0 | 17400 | 0.3347 | 0.1254 | 0.2508 | 0.2508 | nan | 0.2508 | 0.0 | 0.2508 |
| 0.1801 | 88.0 | 17600 | 0.3297 | 0.1228 | 0.2456 | 0.2456 | nan | 0.2456 | 0.0 | 0.2456 |
| 0.166 | 89.0 | 17800 | 0.2994 | 0.2059 | 0.4119 | 0.4119 | nan | 0.4119 | 0.0 | 0.4119 |
| 0.1854 | 90.0 | 18000 | 0.3618 | 0.1591 | 0.3182 | 0.3182 | nan | 0.3182 | 0.0 | 0.3182 |
| 0.1645 | 91.0 | 18200 | 0.3185 | 0.1681 | 0.3362 | 0.3362 | nan | 0.3362 | 0.0 | 0.3362 |
| 0.1594 | 92.0 | 18400 | 0.2970 | 0.1969 | 0.3939 | 0.3939 | nan | 0.3939 | 0.0 | 0.3939 |
| 0.1585 | 93.0 | 18600 | 0.3421 | 0.1637 | 0.3275 | 0.3275 | nan | 0.3275 | 0.0 | 0.3275 |
| 0.157 | 94.0 | 18800 | 0.3398 | 0.2062 | 0.4124 | 0.4124 | nan | 0.4124 | 0.0 | 0.4124 |
| 0.1504 | 95.0 | 19000 | 0.3301 | 0.1930 | 0.3860 | 0.3860 | nan | 0.3860 | 0.0 | 0.3860 |
| 0.1723 | 96.0 | 19200 | 0.4179 | 0.0796 | 0.1592 | 0.1592 | nan | 0.1592 | 0.0 | 0.1592 |
| 0.1498 | 97.0 | 19400 | 0.3558 | 0.1449 | 0.2898 | 0.2898 | nan | 0.2898 | 0.0 | 0.2898 |
| 0.1434 | 98.0 | 19600 | 0.3500 | 0.1437 | 0.2874 | 0.2874 | nan | 0.2874 | 0.0 | 0.2874 |
| 0.1482 | 99.0 | 19800 | 0.3889 | 0.1434 | 0.2869 | 0.2869 | nan | 0.2869 | 0.0 | 0.2869 |
| 0.1611 | 100.0 | 20000 | 0.3552 | 0.1443 | 0.2885 | 0.2885 | nan | 0.2885 | 0.0 | 0.2885 |
| 0.1458 | 101.0 | 20200 | 0.3581 | 0.1546 | 0.3093 | 0.3093 | nan | 0.3093 | 0.0 | 0.3093 |
| 0.1715 | 102.0 | 20400 | 0.2979 | 0.2220 | 0.4440 | 0.4440 | nan | 0.4440 | 0.0 | 0.4440 |
| 0.1448 | 103.0 | 20600 | 0.3976 | 0.1216 | 0.2433 | 0.2433 | nan | 0.2433 | 0.0 | 0.2433 |
| 0.1618 | 104.0 | 20800 | 0.3676 | 0.1308 | 0.2616 | 0.2616 | nan | 0.2616 | 0.0 | 0.2616 |
| 0.1505 | 105.0 | 21000 | 0.3864 | 0.1239 | 0.2478 | 0.2478 | nan | 0.2478 | 0.0 | 0.2478 |
| 0.1429 | 106.0 | 21200 | 0.3338 | 0.1486 | 0.2972 | 0.2972 | nan | 0.2972 | 0.0 | 0.2972 |
| 0.1603 | 107.0 | 21400 | 0.3180 | 0.1818 | 0.3635 | 0.3635 | nan | 0.3635 | 0.0 | 0.3635 |
| 0.1315 | 108.0 | 21600 | 0.3249 | 0.1929 | 0.3858 | 0.3858 | nan | 0.3858 | 0.0 | 0.3858 |
| 0.1529 | 109.0 | 21800 | 0.3415 | 0.1390 | 0.2781 | 0.2781 | nan | 0.2781 | 0.0 | 0.2781 |
| 0.1411 | 110.0 | 22000 | 0.3496 | 0.1615 | 0.3231 | 0.3231 | nan | 0.3231 | 0.0 | 0.3231 |
| 0.1436 | 111.0 | 22200 | 0.3561 | 0.1728 | 0.3457 | 0.3457 | nan | 0.3457 | 0.0 | 0.3457 |
| 0.1435 | 112.0 | 22400 | 0.3210 | 0.2288 | 0.4576 | 0.4576 | nan | 0.4576 | 0.0 | 0.4576 |
| 0.1399 | 113.0 | 22600 | 0.3329 | 0.2105 | 0.4211 | 0.4211 | nan | 0.4211 | 0.0 | 0.4211 |
| 0.1502 | 114.0 | 22800 | 0.3317 | 0.2070 | 0.4140 | 0.4140 | nan | 0.4140 | 0.0 | 0.4140 |
| 0.1278 | 115.0 | 23000 | 0.3524 | 0.1437 | 0.2873 | 0.2873 | nan | 0.2873 | 0.0 | 0.2873 |
| 0.1404 | 116.0 | 23200 | 0.4152 | 0.1668 | 0.3337 | 0.3337 | nan | 0.3337 | 0.0 | 0.3337 |
| 0.1509 | 117.0 | 23400 | 0.3999 | 0.1182 | 0.2365 | 0.2365 | nan | 0.2365 | 0.0 | 0.2365 |
| 0.1318 | 118.0 | 23600 | 0.3659 | 0.1791 | 0.3583 | 0.3583 | nan | 0.3583 | 0.0 | 0.3583 |
| 0.1415 | 119.0 | 23800 | 0.3563 | 0.1570 | 0.3139 | 0.3139 | nan | 0.3139 | 0.0 | 0.3139 |
| 0.1314 | 120.0 | 24000 | 0.3923 | 0.1296 | 0.2593 | 0.2593 | nan | 0.2593 | 0.0 | 0.2593 |
| 0.1352 | 121.0 | 24200 | 0.3806 | 0.1428 | 0.2856 | 0.2856 | nan | 0.2856 | 0.0 | 0.2856 |
| 0.1246 | 122.0 | 24400 | 0.3655 | 0.1824 | 0.3648 | 0.3648 | nan | 0.3648 | 0.0 | 0.3648 |
| 0.1304 | 123.0 | 24600 | 0.3684 | 0.1489 | 0.2977 | 0.2977 | nan | 0.2977 | 0.0 | 0.2977 |
| 0.1336 | 124.0 | 24800 | 0.3557 | 0.1777 | 0.3554 | 0.3554 | nan | 0.3554 | 0.0 | 0.3554 |
| 0.1298 | 125.0 | 25000 | 0.3803 | 0.1568 | 0.3137 | 0.3137 | nan | 0.3137 | 0.0 | 0.3137 |
| 0.1415 | 126.0 | 25200 | 0.3505 | 0.2155 | 0.4310 | 0.4310 | nan | 0.4310 | 0.0 | 0.4310 |
| 0.1379 | 127.0 | 25400 | 0.3882 | 0.1559 | 0.3117 | 0.3117 | nan | 0.3117 | 0.0 | 0.3117 |
| 0.1404 | 128.0 | 25600 | 0.4062 | 0.1630 | 0.3261 | 0.3261 | nan | 0.3261 | 0.0 | 0.3261 |
| 0.1139 | 129.0 | 25800 | 0.3952 | 0.1484 | 0.2968 | 0.2968 | nan | 0.2968 | 0.0 | 0.2968 |
| 0.1248 | 130.0 | 26000 | 0.4175 | 0.1519 | 0.3037 | 0.3037 | nan | 0.3037 | 0.0 | 0.3037 |
| 0.1327 | 131.0 | 26200 | 0.4188 | 0.1596 | 0.3193 | 0.3193 | nan | 0.3193 | 0.0 | 0.3193 |
| 0.1196 | 132.0 | 26400 | 0.3954 | 0.1555 | 0.3110 | 0.3110 | nan | 0.3110 | 0.0 | 0.3110 |
| 0.1368 | 133.0 | 26600 | 0.3569 | 0.1999 | 0.3998 | 0.3998 | nan | 0.3998 | 0.0 | 0.3998 |
| 0.1082 | 134.0 | 26800 | 0.4083 | 0.1299 | 0.2597 | 0.2597 | nan | 0.2597 | 0.0 | 0.2597 |
| 0.1208 | 135.0 | 27000 | 0.3600 | 0.1744 | 0.3487 | 0.3487 | nan | 0.3487 | 0.0 | 0.3487 |
| 0.1291 | 136.0 | 27200 | 0.3654 | 0.1714 | 0.3428 | 0.3428 | nan | 0.3428 | 0.0 | 0.3428 |
| 0.1187 | 137.0 | 27400 | 0.3945 | 0.1727 | 0.3453 | 0.3453 | nan | 0.3453 | 0.0 | 0.3453 |
| 0.1109 | 138.0 | 27600 | 0.4242 | 0.1790 | 0.3579 | 0.3579 | nan | 0.3579 | 0.0 | 0.3579 |
| 0.124 | 139.0 | 27800 | 0.3865 | 0.1699 | 0.3399 | 0.3399 | nan | 0.3399 | 0.0 | 0.3399 |
| 0.106 | 140.0 | 28000 | 0.3950 | 0.1897 | 0.3793 | 0.3793 | nan | 0.3793 | 0.0 | 0.3793 |
| 0.1208 | 141.0 | 28200 | 0.4181 | 0.1561 | 0.3122 | 0.3122 | nan | 0.3122 | 0.0 | 0.3122 |
| 0.1199 | 142.0 | 28400 | 0.3747 | 0.1872 | 0.3745 | 0.3745 | nan | 0.3745 | 0.0 | 0.3745 |
| 0.1181 | 143.0 | 28600 | 0.4158 | 0.1831 | 0.3662 | 0.3662 | nan | 0.3662 | 0.0 | 0.3662 |
| 0.1192 | 144.0 | 28800 | 0.4764 | 0.1040 | 0.2080 | 0.2080 | nan | 0.2080 | 0.0 | 0.2080 |
| 0.1388 | 145.0 | 29000 | 0.3644 | 0.1885 | 0.3770 | 0.3770 | nan | 0.3770 | 0.0 | 0.3770 |
| 0.1163 | 146.0 | 29200 | 0.4045 | 0.1505 | 0.3010 | 0.3010 | nan | 0.3010 | 0.0 | 0.3010 |
| 0.1062 | 147.0 | 29400 | 0.4440 | 0.1596 | 0.3192 | 0.3192 | nan | 0.3192 | 0.0 | 0.3192 |
| 0.1056 | 148.0 | 29600 | 0.3610 | 0.2258 | 0.4515 | 0.4515 | nan | 0.4515 | 0.0 | 0.4515 |
| 0.1031 | 149.0 | 29800 | 0.4054 | 0.1790 | 0.3580 | 0.3580 | nan | 0.3580 | 0.0 | 0.3580 |
| 0.1067 | 150.0 | 30000 | 0.4192 | 0.1701 | 0.3402 | 0.3402 | nan | 0.3402 | 0.0 | 0.3402 |
| 0.1073 | 151.0 | 30200 | 0.4244 | 0.1607 | 0.3214 | 0.3214 | nan | 0.3214 | 0.0 | 0.3214 |
| 0.1012 | 152.0 | 30400 | 0.3834 | 0.1776 | 0.3553 | 0.3553 | nan | 0.3553 | 0.0 | 0.3553 |
| 0.1401 | 153.0 | 30600 | 0.3860 | 0.1813 | 0.3625 | 0.3625 | nan | 0.3625 | 0.0 | 0.3625 |
| 0.0952 | 154.0 | 30800 | 0.3846 | 0.1776 | 0.3553 | 0.3553 | nan | 0.3553 | 0.0 | 0.3553 |
| 0.107 | 155.0 | 31000 | 0.4437 | 0.1360 | 0.2720 | 0.2720 | nan | 0.2720 | 0.0 | 0.2720 |
| 0.0928 | 156.0 | 31200 | 0.4810 | 0.1399 | 0.2799 | 0.2799 | nan | 0.2799 | 0.0 | 0.2799 |
| 0.1014 | 157.0 | 31400 | 0.4275 | 0.1579 | 0.3158 | 0.3158 | nan | 0.3158 | 0.0 | 0.3158 |
| 0.1048 | 158.0 | 31600 | 0.4289 | 0.1580 | 0.3160 | 0.3160 | nan | 0.3160 | 0.0 | 0.3160 |
| 0.1069 | 159.0 | 31800 | 0.4172 | 0.1793 | 0.3585 | 0.3585 | nan | 0.3585 | 0.0 | 0.3585 |
| 0.0905 | 160.0 | 32000 | 0.4604 | 0.1479 | 0.2958 | 0.2958 | nan | 0.2958 | 0.0 | 0.2958 |
| 0.1061 | 161.0 | 32200 | 0.4001 | 0.1837 | 0.3674 | 0.3674 | nan | 0.3674 | 0.0 | 0.3674 |
| 0.1098 | 162.0 | 32400 | 0.3794 | 0.1994 | 0.3987 | 0.3987 | nan | 0.3987 | 0.0 | 0.3987 |
| 0.1064 | 163.0 | 32600 | 0.4152 | 0.1815 | 0.3629 | 0.3629 | nan | 0.3629 | 0.0 | 0.3629 |
| 0.0907 | 164.0 | 32800 | 0.4278 | 0.1724 | 0.3447 | 0.3447 | nan | 0.3447 | 0.0 | 0.3447 |
| 0.0982 | 165.0 | 33000 | 0.3928 | 0.1833 | 0.3666 | 0.3666 | nan | 0.3666 | 0.0 | 0.3666 |
| 0.1118 | 166.0 | 33200 | 0.3973 | 0.1823 | 0.3647 | 0.3647 | nan | 0.3647 | 0.0 | 0.3647 |
| 0.0851 | 167.0 | 33400 | 0.4348 | 0.1362 | 0.2724 | 0.2724 | nan | 0.2724 | 0.0 | 0.2724 |
| 0.0969 | 168.0 | 33600 | 0.4289 | 0.1469 | 0.2938 | 0.2938 | nan | 0.2938 | 0.0 | 0.2938 |
| 0.0846 | 169.0 | 33800 | 0.4067 | 0.1943 | 0.3887 | 0.3887 | nan | 0.3887 | 0.0 | 0.3887 |
| 0.104 | 170.0 | 34000 | 0.4138 | 0.2032 | 0.4064 | 0.4064 | nan | 0.4064 | 0.0 | 0.4064 |
| 0.0953 | 171.0 | 34200 | 0.4546 | 0.1566 | 0.3131 | 0.3131 | nan | 0.3131 | 0.0 | 0.3131 |
| 0.0904 | 172.0 | 34400 | 0.4305 | 0.1682 | 0.3364 | 0.3364 | nan | 0.3364 | 0.0 | 0.3364 |
| 0.1049 | 173.0 | 34600 | 0.4279 | 0.1909 | 0.3817 | 0.3817 | nan | 0.3817 | 0.0 | 0.3817 |
| 0.0896 | 174.0 | 34800 | 0.4234 | 0.1818 | 0.3635 | 0.3635 | nan | 0.3635 | 0.0 | 0.3635 |
| 0.0844 | 175.0 | 35000 | 0.3966 | 0.1820 | 0.3640 | 0.3640 | nan | 0.3640 | 0.0 | 0.3640 |
| 0.0924 | 176.0 | 35200 | 0.4198 | 0.1936 | 0.3871 | 0.3871 | nan | 0.3871 | 0.0 | 0.3871 |
| 0.0915 | 177.0 | 35400 | 0.4725 | 0.1273 | 0.2545 | 0.2545 | nan | 0.2545 | 0.0 | 0.2545 |
| 0.0809 | 178.0 | 35600 | 0.4902 | 0.1346 | 0.2693 | 0.2693 | nan | 0.2693 | 0.0 | 0.2693 |
| 0.0826 | 179.0 | 35800 | 0.4810 | 0.1375 | 0.2750 | 0.2750 | nan | 0.2750 | 0.0 | 0.2750 |
| 0.0946 | 180.0 | 36000 | 0.4850 | 0.1579 | 0.3157 | 0.3157 | nan | 0.3157 | 0.0 | 0.3157 |
| 0.0998 | 181.0 | 36200 | 0.4144 | 0.1757 | 0.3514 | 0.3514 | nan | 0.3514 | 0.0 | 0.3514 |
| 0.0853 | 182.0 | 36400 | 0.4106 | 0.1865 | 0.3730 | 0.3730 | nan | 0.3730 | 0.0 | 0.3730 |
| 0.0929 | 183.0 | 36600 | 0.4150 | 0.1760 | 0.3520 | 0.3520 | nan | 0.3520 | 0.0 | 0.3520 |
| 0.0841 | 184.0 | 36800 | 0.4080 | 0.1833 | 0.3667 | 0.3667 | nan | 0.3667 | 0.0 | 0.3667 |
| 0.0968 | 185.0 | 37000 | 0.4380 | 0.1606 | 0.3213 | 0.3213 | nan | 0.3213 | 0.0 | 0.3213 |
| 0.0969 | 186.0 | 37200 | 0.4411 | 0.1477 | 0.2955 | 0.2955 | nan | 0.2955 | 0.0 | 0.2955 |
| 0.0766 | 187.0 | 37400 | 0.4158 | 0.1885 | 0.3770 | 0.3770 | nan | 0.3770 | 0.0 | 0.3770 |
| 0.0942 | 188.0 | 37600 | 0.4429 | 0.1632 | 0.3265 | 0.3265 | nan | 0.3265 | 0.0 | 0.3265 |
| 0.088 | 189.0 | 37800 | 0.4197 | 0.2039 | 0.4078 | 0.4078 | nan | 0.4078 | 0.0 | 0.4078 |
| 0.1127 | 190.0 | 38000 | 0.4447 | 0.1986 | 0.3972 | 0.3972 | nan | 0.3972 | 0.0 | 0.3972 |
| 0.0877 | 191.0 | 38200 | 0.4299 | 0.1790 | 0.3579 | 0.3579 | nan | 0.3579 | 0.0 | 0.3579 |
| 0.0822 | 192.0 | 38400 | 0.4455 | 0.1606 | 0.3211 | 0.3211 | nan | 0.3211 | 0.0 | 0.3211 |
| 0.0878 | 193.0 | 38600 | 0.4550 | 0.1531 | 0.3061 | 0.3061 | nan | 0.3061 | 0.0 | 0.3061 |
| 0.0778 | 194.0 | 38800 | 0.4967 | 0.1439 | 0.2879 | 0.2879 | nan | 0.2879 | 0.0 | 0.2879 |
| 0.0791 | 195.0 | 39000 | 0.4543 | 0.1784 | 0.3568 | 0.3568 | nan | 0.3568 | 0.0 | 0.3568 |
| 0.091 | 196.0 | 39200 | 0.4568 | 0.1772 | 0.3544 | 0.3544 | nan | 0.3544 | 0.0 | 0.3544 |
| 0.0773 | 197.0 | 39400 | 0.4261 | 0.1766 | 0.3531 | 0.3531 | nan | 0.3531 | 0.0 | 0.3531 |
| 0.0888 | 198.0 | 39600 | 0.4710 | 0.1770 | 0.3540 | 0.3540 | nan | 0.3540 | 0.0 | 0.3540 |
| 0.0864 | 199.0 | 39800 | 0.4497 | 0.1718 | 0.3436 | 0.3436 | nan | 0.3436 | 0.0 | 0.3436 |
| 0.0749 | 200.0 | 40000 | 0.4302 | 0.1758 | 0.3515 | 0.3515 | nan | 0.3515 | 0.0 | 0.3515 |
| 0.0849 | 201.0 | 40200 | 0.4738 | 0.1593 | 0.3187 | 0.3187 | nan | 0.3187 | 0.0 | 0.3187 |
| 0.0727 | 202.0 | 40400 | 0.5169 | 0.1392 | 0.2784 | 0.2784 | nan | 0.2784 | 0.0 | 0.2784 |
| 0.076 | 203.0 | 40600 | 0.4296 | 0.1833 | 0.3667 | 0.3667 | nan | 0.3667 | 0.0 | 0.3667 |
| 0.0832 | 204.0 | 40800 | 0.4600 | 0.1804 | 0.3608 | 0.3608 | nan | 0.3608 | 0.0 | 0.3608 |
| 0.1003 | 205.0 | 41000 | 0.4732 | 0.1383 | 0.2767 | 0.2767 | nan | 0.2767 | 0.0 | 0.2767 |
| 0.0765 | 206.0 | 41200 | 0.4839 | 0.1625 | 0.3250 | 0.3250 | nan | 0.3250 | 0.0 | 0.3250 |
| 0.0983 | 207.0 | 41400 | 0.4489 | 0.1958 | 0.3916 | 0.3916 | nan | 0.3916 | 0.0 | 0.3916 |
| 0.0854 | 208.0 | 41600 | 0.4425 | 0.1763 | 0.3526 | 0.3526 | nan | 0.3526 | 0.0 | 0.3526 |
| 0.0742 | 209.0 | 41800 | 0.4470 | 0.1555 | 0.3110 | 0.3110 | nan | 0.3110 | 0.0 | 0.3110 |
| 0.0743 | 210.0 | 42000 | 0.5047 | 0.1411 | 0.2821 | 0.2821 | nan | 0.2821 | 0.0 | 0.2821 |
| 0.0764 | 211.0 | 42200 | 0.4252 | 0.2074 | 0.4149 | 0.4149 | nan | 0.4149 | 0.0 | 0.4149 |
| 0.0779 | 212.0 | 42400 | 0.4772 | 0.1463 | 0.2927 | 0.2927 | nan | 0.2927 | 0.0 | 0.2927 |
| 0.092 | 213.0 | 42600 | 0.4905 | 0.1428 | 0.2856 | 0.2856 | nan | 0.2856 | 0.0 | 0.2856 |
| 0.0756 | 214.0 | 42800 | 0.4657 | 0.1668 | 0.3336 | 0.3336 | nan | 0.3336 | 0.0 | 0.3336 |
| 0.0896 | 215.0 | 43000 | 0.4542 | 0.1620 | 0.3240 | 0.3240 | nan | 0.3240 | 0.0 | 0.3240 |
| 0.0709 | 216.0 | 43200 | 0.4828 | 0.1585 | 0.3170 | 0.3170 | nan | 0.3170 | 0.0 | 0.3170 |
| 0.0841 | 217.0 | 43400 | 0.4861 | 0.1549 | 0.3099 | 0.3099 | nan | 0.3099 | 0.0 | 0.3099 |
| 0.0905 | 218.0 | 43600 | 0.5044 | 0.1656 | 0.3313 | 0.3313 | nan | 0.3313 | 0.0 | 0.3313 |
| 0.0824 | 219.0 | 43800 | 0.4952 | 0.1415 | 0.2829 | 0.2829 | nan | 0.2829 | 0.0 | 0.2829 |
| 0.0763 | 220.0 | 44000 | 0.4828 | 0.1473 | 0.2946 | 0.2946 | nan | 0.2946 | 0.0 | 0.2946 |
| 0.069 | 221.0 | 44200 | 0.5279 | 0.1378 | 0.2756 | 0.2756 | nan | 0.2756 | 0.0 | 0.2756 |
| 0.0754 | 222.0 | 44400 | 0.4968 | 0.1639 | 0.3277 | 0.3277 | nan | 0.3277 | 0.0 | 0.3277 |
| 0.0698 | 223.0 | 44600 | 0.4330 | 0.1791 | 0.3582 | 0.3582 | nan | 0.3582 | 0.0 | 0.3582 |
| 0.0782 | 224.0 | 44800 | 0.4973 | 0.1476 | 0.2952 | 0.2952 | nan | 0.2952 | 0.0 | 0.2952 |
| 0.0837 | 225.0 | 45000 | 0.4926 | 0.1830 | 0.3660 | 0.3660 | nan | 0.3660 | 0.0 | 0.3660 |
| 0.0784 | 226.0 | 45200 | 0.4883 | 0.1751 | 0.3502 | 0.3502 | nan | 0.3502 | 0.0 | 0.3502 |
| 0.0715 | 227.0 | 45400 | 0.5459 | 0.1419 | 0.2838 | 0.2838 | nan | 0.2838 | 0.0 | 0.2838 |
| 0.0739 | 228.0 | 45600 | 0.5471 | 0.1509 | 0.3018 | 0.3018 | nan | 0.3018 | 0.0 | 0.3018 |
| 0.0722 | 229.0 | 45800 | 0.4645 | 0.1761 | 0.3522 | 0.3522 | nan | 0.3522 | 0.0 | 0.3522 |
| 0.0777 | 230.0 | 46000 | 0.4787 | 0.1775 | 0.3550 | 0.3550 | nan | 0.3550 | 0.0 | 0.3550 |
| 0.0804 | 231.0 | 46200 | 0.4783 | 0.1717 | 0.3435 | 0.3435 | nan | 0.3435 | 0.0 | 0.3435 |
| 0.0762 | 232.0 | 46400 | 0.4301 | 0.1811 | 0.3623 | 0.3623 | nan | 0.3623 | 0.0 | 0.3623 |
| 0.07 | 233.0 | 46600 | 0.4978 | 0.1657 | 0.3315 | 0.3315 | nan | 0.3315 | 0.0 | 0.3315 |
| 0.0656 | 234.0 | 46800 | 0.4942 | 0.1628 | 0.3256 | 0.3256 | nan | 0.3256 | 0.0 | 0.3256 |
| 0.0653 | 235.0 | 47000 | 0.4773 | 0.1633 | 0.3267 | 0.3267 | nan | 0.3267 | 0.0 | 0.3267 |
| 0.0719 | 236.0 | 47200 | 0.4606 | 0.1863 | 0.3726 | 0.3726 | nan | 0.3726 | 0.0 | 0.3726 |
| 0.0813 | 237.0 | 47400 | 0.4666 | 0.1713 | 0.3427 | 0.3427 | nan | 0.3427 | 0.0 | 0.3427 |
| 0.074 | 238.0 | 47600 | 0.4619 | 0.1870 | 0.3740 | 0.3740 | nan | 0.3740 | 0.0 | 0.3740 |
| 0.0657 | 239.0 | 47800 | 0.4785 | 0.1580 | 0.3160 | 0.3160 | nan | 0.3160 | 0.0 | 0.3160 |
| 0.0699 | 240.0 | 48000 | 0.4868 | 0.1625 | 0.3250 | 0.3250 | nan | 0.3250 | 0.0 | 0.3250 |
| 0.07 | 241.0 | 48200 | 0.4609 | 0.1768 | 0.3535 | 0.3535 | nan | 0.3535 | 0.0 | 0.3535 |
| 0.0657 | 242.0 | 48400 | 0.4922 | 0.1744 | 0.3489 | 0.3489 | nan | 0.3489 | 0.0 | 0.3489 |
| 0.0701 | 243.0 | 48600 | 0.5228 | 0.1563 | 0.3126 | 0.3126 | nan | 0.3126 | 0.0 | 0.3126 |
| 0.0713 | 244.0 | 48800 | 0.4929 | 0.1558 | 0.3116 | 0.3116 | nan | 0.3116 | 0.0 | 0.3116 |
| 0.0722 | 245.0 | 49000 | 0.4586 | 0.1810 | 0.3620 | 0.3620 | nan | 0.3620 | 0.0 | 0.3620 |
| 0.0627 | 246.0 | 49200 | 0.4928 | 0.1552 | 0.3103 | 0.3103 | nan | 0.3103 | 0.0 | 0.3103 |
| 0.0688 | 247.0 | 49400 | 0.5193 | 0.1531 | 0.3063 | 0.3063 | nan | 0.3063 | 0.0 | 0.3063 |
| 0.0671 | 248.0 | 49600 | 0.4756 | 0.1916 | 0.3832 | 0.3832 | nan | 0.3832 | 0.0 | 0.3832 |
| 0.0583 | 249.0 | 49800 | 0.5171 | 0.1532 | 0.3064 | 0.3064 | nan | 0.3064 | 0.0 | 0.3064 |
| 0.0702 | 250.0 | 50000 | 0.4813 | 0.1711 | 0.3421 | 0.3421 | nan | 0.3421 | 0.0 | 0.3421 |
| 0.0723 | 251.0 | 50200 | 0.4808 | 0.1623 | 0.3246 | 0.3246 | nan | 0.3246 | 0.0 | 0.3246 |
| 0.0556 | 252.0 | 50400 | 0.4696 | 0.1831 | 0.3663 | 0.3663 | nan | 0.3663 | 0.0 | 0.3663 |
| 0.0673 | 253.0 | 50600 | 0.4599 | 0.1897 | 0.3793 | 0.3793 | nan | 0.3793 | 0.0 | 0.3793 |
| 0.0696 | 254.0 | 50800 | 0.4847 | 0.1684 | 0.3367 | 0.3367 | nan | 0.3367 | 0.0 | 0.3367 |
| 0.0624 | 255.0 | 51000 | 0.5151 | 0.1586 | 0.3172 | 0.3172 | nan | 0.3172 | 0.0 | 0.3172 |
| 0.0697 | 256.0 | 51200 | 0.5099 | 0.1585 | 0.3169 | 0.3169 | nan | 0.3169 | 0.0 | 0.3169 |
| 0.0613 | 257.0 | 51400 | 0.4962 | 0.1693 | 0.3385 | 0.3385 | nan | 0.3385 | 0.0 | 0.3385 |
| 0.0641 | 258.0 | 51600 | 0.5652 | 0.1417 | 0.2834 | 0.2834 | nan | 0.2834 | 0.0 | 0.2834 |
| 0.0648 | 259.0 | 51800 | 0.5027 | 0.1554 | 0.3108 | 0.3108 | nan | 0.3108 | 0.0 | 0.3108 |
| 0.0617 | 260.0 | 52000 | 0.4845 | 0.1856 | 0.3712 | 0.3712 | nan | 0.3712 | 0.0 | 0.3712 |
| 0.0662 | 261.0 | 52200 | 0.4792 | 0.1738 | 0.3477 | 0.3477 | nan | 0.3477 | 0.0 | 0.3477 |
| 0.0701 | 262.0 | 52400 | 0.5169 | 0.1763 | 0.3526 | 0.3526 | nan | 0.3526 | 0.0 | 0.3526 |
| 0.0718 | 263.0 | 52600 | 0.4892 | 0.1725 | 0.3449 | 0.3449 | nan | 0.3449 | 0.0 | 0.3449 |
| 0.0681 | 264.0 | 52800 | 0.4890 | 0.1795 | 0.3589 | 0.3589 | nan | 0.3589 | 0.0 | 0.3589 |
| 0.0663 | 265.0 | 53000 | 0.5337 | 0.1511 | 0.3023 | 0.3023 | nan | 0.3023 | 0.0 | 0.3023 |
| 0.0732 | 266.0 | 53200 | 0.5129 | 0.1678 | 0.3355 | 0.3355 | nan | 0.3355 | 0.0 | 0.3355 |
| 0.0667 | 267.0 | 53400 | 0.5392 | 0.1557 | 0.3113 | 0.3113 | nan | 0.3113 | 0.0 | 0.3113 |
| 0.0655 | 268.0 | 53600 | 0.5116 | 0.1395 | 0.2789 | 0.2789 | nan | 0.2789 | 0.0 | 0.2789 |
| 0.0712 | 269.0 | 53800 | 0.4868 | 0.1745 | 0.3491 | 0.3491 | nan | 0.3491 | 0.0 | 0.3491 |
| 0.0613 | 270.0 | 54000 | 0.5079 | 0.1602 | 0.3204 | 0.3204 | nan | 0.3204 | 0.0 | 0.3204 |
| 0.0638 | 271.0 | 54200 | 0.5035 | 0.1608 | 0.3216 | 0.3216 | nan | 0.3216 | 0.0 | 0.3216 |
| 0.07 | 272.0 | 54400 | 0.4959 | 0.1649 | 0.3298 | 0.3298 | nan | 0.3298 | 0.0 | 0.3298 |
| 0.0605 | 273.0 | 54600 | 0.5098 | 0.1634 | 0.3269 | 0.3269 | nan | 0.3269 | 0.0 | 0.3269 |
| 0.0766 | 274.0 | 54800 | 0.4912 | 0.1731 | 0.3463 | 0.3463 | nan | 0.3463 | 0.0 | 0.3463 |
| 0.0703 | 275.0 | 55000 | 0.4843 | 0.1601 | 0.3201 | 0.3201 | nan | 0.3201 | 0.0 | 0.3201 |
| 0.0581 | 276.0 | 55200 | 0.4665 | 0.1640 | 0.3280 | 0.3280 | nan | 0.3280 | 0.0 | 0.3280 |
| 0.0699 | 277.0 | 55400 | 0.4574 | 0.1776 | 0.3552 | 0.3552 | nan | 0.3552 | 0.0 | 0.3552 |
| 0.058 | 278.0 | 55600 | 0.4847 | 0.1756 | 0.3511 | 0.3511 | nan | 0.3511 | 0.0 | 0.3511 |
| 0.0686 | 279.0 | 55800 | 0.5274 | 0.1528 | 0.3055 | 0.3055 | nan | 0.3055 | 0.0 | 0.3055 |
| 0.0576 | 280.0 | 56000 | 0.4540 | 0.1798 | 0.3596 | 0.3596 | nan | 0.3596 | 0.0 | 0.3596 |
| 0.0559 | 281.0 | 56200 | 0.5078 | 0.1677 | 0.3354 | 0.3354 | nan | 0.3354 | 0.0 | 0.3354 |
| 0.0604 | 282.0 | 56400 | 0.4987 | 0.1665 | 0.3330 | 0.3330 | nan | 0.3330 | 0.0 | 0.3330 |
| 0.0639 | 283.0 | 56600 | 0.4881 | 0.1606 | 0.3213 | 0.3213 | nan | 0.3213 | 0.0 | 0.3213 |
| 0.0645 | 284.0 | 56800 | 0.4821 | 0.1753 | 0.3506 | 0.3506 | nan | 0.3506 | 0.0 | 0.3506 |
| 0.06 | 285.0 | 57000 | 0.5155 | 0.1660 | 0.3320 | 0.3320 | nan | 0.3320 | 0.0 | 0.3320 |
| 0.065 | 286.0 | 57200 | 0.4960 | 0.1695 | 0.3390 | 0.3390 | nan | 0.3390 | 0.0 | 0.3390 |
| 0.0606 | 287.0 | 57400 | 0.4813 | 0.1741 | 0.3481 | 0.3481 | nan | 0.3481 | 0.0 | 0.3481 |
| 0.0628 | 288.0 | 57600 | 0.5220 | 0.1678 | 0.3356 | 0.3356 | nan | 0.3356 | 0.0 | 0.3356 |
| 0.0607 | 289.0 | 57800 | 0.4892 | 0.1788 | 0.3577 | 0.3577 | nan | 0.3577 | 0.0 | 0.3577 |
| 0.0585 | 290.0 | 58000 | 0.5029 | 0.1613 | 0.3227 | 0.3227 | nan | 0.3227 | 0.0 | 0.3227 |
| 0.0532 | 291.0 | 58200 | 0.4909 | 0.1689 | 0.3378 | 0.3378 | nan | 0.3378 | 0.0 | 0.3378 |
| 0.0575 | 292.0 | 58400 | 0.4877 | 0.1649 | 0.3299 | 0.3299 | nan | 0.3299 | 0.0 | 0.3299 |
| 0.0573 | 293.0 | 58600 | 0.5360 | 0.1580 | 0.3161 | 0.3161 | nan | 0.3161 | 0.0 | 0.3161 |
| 0.0602 | 294.0 | 58800 | 0.5095 | 0.1615 | 0.3230 | 0.3230 | nan | 0.3230 | 0.0 | 0.3230 |
| 0.0646 | 295.0 | 59000 | 0.5142 | 0.1614 | 0.3228 | 0.3228 | nan | 0.3228 | 0.0 | 0.3228 |
| 0.0583 | 296.0 | 59200 | 0.4907 | 0.1635 | 0.3270 | 0.3270 | nan | 0.3270 | 0.0 | 0.3270 |
| 0.0595 | 297.0 | 59400 | 0.5081 | 0.1619 | 0.3238 | 0.3238 | nan | 0.3238 | 0.0 | 0.3238 |
| 0.0664 | 298.0 | 59600 | 0.5236 | 0.1608 | 0.3217 | 0.3217 | nan | 0.3217 | 0.0 | 0.3217 |
| 0.0675 | 299.0 | 59800 | 0.5317 | 0.1611 | 0.3222 | 0.3222 | nan | 0.3222 | 0.0 | 0.3222 |
| 0.0619 | 300.0 | 60000 | 0.4916 | 0.1667 | 0.3335 | 0.3335 | nan | 0.3335 | 0.0 | 0.3335 |
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"normal",
"hallucination"
] |
jenniferlumeng/umaxtools-b0-v01-finetuned-segments-outputs |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# umaxtools-b0-v01-finetuned-segments-outputs
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jenniferlumeng/umaxtools dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4120
- Mean Iou: 0.4577
- Mean Accuracy: 0.9154
- Overall Accuracy: 0.9154
- Accuracy Background: nan
- Accuracy Object: 0.9154
- Iou Background: 0.0
- Iou Object: 0.9154
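Since the usage sections below are still placeholders, here is a minimal inference sketch. It assumes the checkpoint is published on the Hub under this repo id and that the standard SegFormer classes from `transformers` apply; `tool.jpg` is a placeholder path, not a file shipped with the model.
```python
# Hedged inference sketch: the repo id is taken from this card; the image
# path is a placeholder.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

ckpt = "jenniferlumeng/umaxtools-b0-v01-finetuned-segments-outputs"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt)
model.eval()

image = Image.open("tool.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample to the input size and take
# the per-pixel argmax to get a {background, object} label map.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```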
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (OptimizerNames.ADAMW_TORCH) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
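As a reproduction aid, the list above maps onto `TrainingArguments` roughly as follows. This is a sketch under stated assumptions, not the original training script: the `output_dir` name is assumed, and the dataset/`Trainer` wiring is omitted.
```python
# Hedged mapping of the hyperparameters above onto TrainingArguments.
# output_dir is an assumed name; dataset loading and Trainer setup omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="umaxtools-b0-v01-finetuned-segments-outputs",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",  # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```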
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Object | Iou Background | Iou Object |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:---------------:|:--------------:|:----------:|
| 0.4422 | 5.0 | 10 | 0.5913 | 0.4820 | 0.9640 | 0.9640 | nan | 0.9640 | 0.0 | 0.9640 |
| 0.3684 | 10.0 | 20 | 0.5141 | 0.4821 | 0.9643 | 0.9643 | nan | 0.9643 | 0.0 | 0.9643 |
| 0.294 | 15.0 | 30 | 0.4940 | 0.4325 | 0.8650 | 0.8650 | nan | 0.8650 | 0.0 | 0.8650 |
| 0.2614 | 20.0 | 40 | 0.4563 | 0.4451 | 0.8901 | 0.8901 | nan | 0.8901 | 0.0 | 0.8901 |
| 0.2981 | 25.0 | 50 | 0.4381 | 0.4500 | 0.8999 | 0.8999 | nan | 0.8999 | 0.0 | 0.8999 |
| 0.245 | 30.0 | 60 | 0.4189 | 0.4588 | 0.9176 | 0.9176 | nan | 0.9176 | 0.0 | 0.9176 |
| 0.2356 | 35.0 | 70 | 0.4185 | 0.4502 | 0.9004 | 0.9004 | nan | 0.9004 | 0.0 | 0.9004 |
| 0.2228 | 40.0 | 80 | 0.4165 | 0.4583 | 0.9166 | 0.9166 | nan | 0.9166 | 0.0 | 0.9166 |
| 0.2162 | 45.0 | 90 | 0.4128 | 0.4613 | 0.9226 | 0.9226 | nan | 0.9226 | 0.0 | 0.9226 |
| 0.2288 | 50.0 | 100 | 0.4120 | 0.4577 | 0.9154 | 0.9154 | nan | 0.9154 | 0.0 | 0.9154 |
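Note on reading these numbers: Mean IoU averages over both classes, and the background class never scores above 0.0 IoU in this run, so Mean IoU is exactly half the object IoU; in the final row, (0.0 + 0.9154) / 2 = 0.4577.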
### Framework versions
- Transformers 4.52.2
- Pytorch 2.6.0+cu124
- Datasets 2.16.1
- Tokenizers 0.21.1
| [
"background",
"object"
] |