# segformer-b5-finetuned-ade20k-hgo-coord_40epochs_distortion_ver2_global_norm_with_void_4
This model is a fine-tuned version of nvidia/segformer-b5-finetuned-ade-640-640 on the NICOPOI-9/Modphad_Perlin_two_void_coord_global_norm dataset. It achieves the following results on the evaluation set:
- Loss: 0.6653
- Mean Iou: 0.7725
- Mean Accuracy: 0.8702
- Overall Accuracy: 0.8824
- Accuracy [0,0]: 0.8550
- Accuracy [0,1]: 0.8883
- Accuracy [1,0]: 0.9019
- Accuracy [1,1]: 0.8817
- Accuracy [0,2]: 0.8976
- Accuracy [0,3]: 0.9033
- Accuracy [1,2]: 0.8715
- Accuracy [1,3]: 0.9091
- Accuracy [2,0]: 0.8286
- Accuracy [2,1]: 0.8755
- Accuracy [2,2]: 0.8668
- Accuracy [2,3]: 0.8119
- Accuracy [3,0]: 0.8624
- Accuracy [3,1]: 0.7922
- Accuracy [3,2]: 0.8500
- Accuracy [3,3]: 0.8287
- Accuracy Void: 0.9695
- Iou [0,0]: 0.7906
- Iou [0,1]: 0.8047
- Iou [1,0]: 0.7816
- Iou [1,1]: 0.8141
- Iou [0,2]: 0.8098
- Iou [0,3]: 0.7654
- Iou [1,2]: 0.7771
- Iou [1,3]: 0.7698
- Iou [2,0]: 0.7262
- Iou [2,1]: 0.7632
- Iou [2,2]: 0.7299
- Iou [2,3]: 0.7208
- Iou [3,0]: 0.7854
- Iou [3,1]: 0.7184
- Iou [3,2]: 0.7428
- Iou [3,3]: 0.7067
- Iou Void: 0.9263
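As a sanity check, the reported Mean IoU and Mean Accuracy are the unweighted means of the 17 per-class values above (the 16 `[row,col]` coordinate classes plus the void class). A minimal sketch in Python, with the values copied from the summary:

```python
# Per-class IoU and accuracy values copied from the evaluation summary above,
# ordered [0,0], [0,1], [1,0], [1,1], [0,2], [0,3], [1,2], [1,3],
#         [2,0], [2,1], [2,2], [2,3], [3,0], [3,1], [3,2], [3,3], Void.
per_class_iou = [
    0.7906, 0.8047, 0.7816, 0.8141, 0.8098, 0.7654, 0.7771, 0.7698,
    0.7262, 0.7632, 0.7299, 0.7208, 0.7854, 0.7184, 0.7428, 0.7067,
    0.9263,
]
per_class_acc = [
    0.8550, 0.8883, 0.9019, 0.8817, 0.8976, 0.9033, 0.8715, 0.9091,
    0.8286, 0.8755, 0.8668, 0.8119, 0.8624, 0.7922, 0.8500, 0.8287,
    0.9695,
]

# Unweighted class means, as reported by the evaluation.
mean_iou = sum(per_class_iou) / len(per_class_iou)
mean_acc = sum(per_class_acc) / len(per_class_acc)
print(round(mean_iou, 4))  # 0.7725
print(round(mean_acc, 4))  # 0.8702
```

Note that Overall Accuracy (0.8824) is higher than Mean Accuracy because it is pixel-weighted, and the void class (accuracy 0.9695) covers a large share of pixels.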
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 160
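The logged step/epoch ratios in the results table below also pin down the effective training-set size: with `train_batch_size: 1` and, assuming the default of no gradient accumulation, steps per epoch equals the number of training samples. A quick consistency check against three logged checkpoints:

```python
# (global_step, epoch) pairs taken from the training-results log.
checkpoints = [(4000, 7.3260), (44000, 80.5861), (84000, 153.8462)]

for step, epoch in checkpoints:
    # With batch size 1 and no gradient accumulation (assumed defaults),
    # step / epoch gives the number of samples seen per epoch.
    print(round(step / epoch))  # ~546 for every checkpoint
```

All checkpoints agree on roughly 546 training samples per epoch, which is consistent with the final logged checkpoint (step 84000, epoch ~153.85) under the configured 160 epochs.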
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy [0,0] | Accuracy [0,1] | Accuracy [1,0] | Accuracy [1,1] | Accuracy [0,2] | Accuracy [0,3] | Accuracy [1,2] | Accuracy [1,3] | Accuracy [2,0] | Accuracy [2,1] | Accuracy [2,2] | Accuracy [2,3] | Accuracy [3,0] | Accuracy [3,1] | Accuracy [3,2] | Accuracy [3,3] | Accuracy Void | Iou [0,0] | Iou [0,1] | Iou [1,0] | Iou [1,1] | Iou [0,2] | Iou [0,3] | Iou [1,2] | Iou [1,3] | Iou [2,0] | Iou [2,1] | Iou [2,2] | Iou [2,3] | Iou [3,0] | Iou [3,1] | Iou [3,2] | Iou [3,3] | Iou Void |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.1886 | 7.3260 | 4000 | 1.2673 | 0.3996 | 0.5640 | 0.6016 | 0.4741 | 0.5454 | 0.5928 | 0.6399 | 0.5156 | 0.4641 | 0.4718 | 0.5480 | 0.4585 | 0.5101 | 0.5382 | 0.5261 | 0.6719 | 0.5590 | 0.4871 | 0.6623 | 0.9223 | 0.4093 | 0.4143 | 0.3984 | 0.4226 | 0.4037 | 0.3569 | 0.3321 | 0.4267 | 0.3569 | 0.3466 | 0.3401 | 0.3917 | 0.4129 | 0.3384 | 0.3123 | 0.2921 | 0.8383 |
1.2869 | 14.6520 | 8000 | 0.9750 | 0.5116 | 0.6709 | 0.7020 | 0.6675 | 0.6769 | 0.7363 | 0.7205 | 0.6219 | 0.6666 | 0.4876 | 0.7955 | 0.6715 | 0.6206 | 0.5169 | 0.7125 | 0.7610 | 0.4896 | 0.6361 | 0.7068 | 0.9174 | 0.5488 | 0.5559 | 0.5446 | 0.4579 | 0.5345 | 0.5201 | 0.4101 | 0.5248 | 0.4857 | 0.4404 | 0.4303 | 0.4828 | 0.5633 | 0.4133 | 0.5004 | 0.4308 | 0.8539 |
1.2291 | 21.9780 | 12000 | 0.8046 | 0.5963 | 0.7469 | 0.7658 | 0.7366 | 0.7485 | 0.7818 | 0.7346 | 0.7464 | 0.8049 | 0.6572 | 0.8151 | 0.6797 | 0.7084 | 0.7394 | 0.7938 | 0.7353 | 0.6297 | 0.7787 | 0.7009 | 0.9068 | 0.6130 | 0.6405 | 0.6005 | 0.6198 | 0.6201 | 0.5553 | 0.5344 | 0.6296 | 0.5199 | 0.5460 | 0.4700 | 0.6153 | 0.6217 | 0.5344 | 0.5849 | 0.5573 | 0.8750 |
0.3104 | 29.3040 | 16000 | 0.6399 | 0.6582 | 0.7930 | 0.8091 | 0.7850 | 0.8352 | 0.8091 | 0.8373 | 0.8137 | 0.8251 | 0.7227 | 0.7906 | 0.8229 | 0.7740 | 0.7435 | 0.8859 | 0.8129 | 0.6392 | 0.7057 | 0.7630 | 0.9157 | 0.6952 | 0.6891 | 0.6964 | 0.6733 | 0.6595 | 0.6223 | 0.6479 | 0.6870 | 0.6201 | 0.6079 | 0.6430 | 0.6449 | 0.6967 | 0.5473 | 0.5996 | 0.5790 | 0.8810 |
0.3486 | 36.6300 | 20000 | 0.6188 | 0.6709 | 0.8015 | 0.8193 | 0.7634 | 0.8269 | 0.8616 | 0.8582 | 0.8198 | 0.8505 | 0.6733 | 0.8336 | 0.8299 | 0.8104 | 0.7407 | 0.8566 | 0.7730 | 0.6485 | 0.7307 | 0.8084 | 0.9401 | 0.7015 | 0.7169 | 0.6902 | 0.7149 | 0.6658 | 0.6510 | 0.6201 | 0.6931 | 0.6274 | 0.6539 | 0.6000 | 0.6483 | 0.6773 | 0.5972 | 0.6464 | 0.6074 | 0.8940 |
0.2437 | 43.9560 | 24000 | 0.6233 | 0.6775 | 0.8026 | 0.8251 | 0.8422 | 0.8838 | 0.8424 | 0.8420 | 0.8351 | 0.8449 | 0.6714 | 0.8251 | 0.8146 | 0.8303 | 0.7672 | 0.8111 | 0.7900 | 0.6462 | 0.7008 | 0.7325 | 0.9644 | 0.7360 | 0.6873 | 0.6956 | 0.7231 | 0.6931 | 0.6536 | 0.6321 | 0.7337 | 0.6171 | 0.6415 | 0.5926 | 0.6804 | 0.7243 | 0.5894 | 0.5893 | 0.6230 | 0.9052 |
0.1864 | 51.2821 | 28000 | 0.5680 | 0.7150 | 0.8333 | 0.8473 | 0.8205 | 0.8365 | 0.8748 | 0.8464 | 0.8739 | 0.8255 | 0.8341 | 0.8842 | 0.7700 | 0.7983 | 0.8625 | 0.7927 | 0.8406 | 0.7202 | 0.8199 | 0.8125 | 0.9538 | 0.7419 | 0.7433 | 0.7285 | 0.7755 | 0.6894 | 0.6936 | 0.7199 | 0.7695 | 0.6154 | 0.6969 | 0.6473 | 0.6570 | 0.7580 | 0.6683 | 0.6683 | 0.6762 | 0.9059 |
0.1692 | 58.6081 | 32000 | 0.5921 | 0.7288 | 0.8426 | 0.8558 | 0.8258 | 0.8749 | 0.8927 | 0.8481 | 0.8773 | 0.8747 | 0.8187 | 0.8771 | 0.7955 | 0.8649 | 0.7956 | 0.7949 | 0.8335 | 0.7759 | 0.8405 | 0.7863 | 0.9475 | 0.7648 | 0.7547 | 0.7451 | 0.7684 | 0.7623 | 0.7160 | 0.7242 | 0.7323 | 0.6573 | 0.6880 | 0.6486 | 0.7025 | 0.7500 | 0.6799 | 0.7119 | 0.6784 | 0.9057 |
0.4861 | 65.9341 | 36000 | 0.5194 | 0.7383 | 0.8482 | 0.8616 | 0.8336 | 0.8530 | 0.8778 | 0.8545 | 0.8688 | 0.8927 | 0.8369 | 0.8942 | 0.8213 | 0.8737 | 0.8223 | 0.8568 | 0.8525 | 0.6965 | 0.8116 | 0.8126 | 0.9609 | 0.7682 | 0.7622 | 0.7594 | 0.7796 | 0.7457 | 0.6981 | 0.7502 | 0.7548 | 0.6797 | 0.7048 | 0.6785 | 0.7433 | 0.7979 | 0.6522 | 0.7041 | 0.6543 | 0.9186 |
0.0915 | 73.2601 | 40000 | 0.5566 | 0.7394 | 0.8480 | 0.8621 | 0.8206 | 0.8965 | 0.9048 | 0.8691 | 0.8445 | 0.8811 | 0.8250 | 0.9031 | 0.8086 | 0.8207 | 0.8112 | 0.8027 | 0.8587 | 0.7725 | 0.8267 | 0.8175 | 0.9533 | 0.7485 | 0.7813 | 0.7311 | 0.7713 | 0.7492 | 0.7206 | 0.7520 | 0.7441 | 0.6908 | 0.7191 | 0.7050 | 0.7131 | 0.7688 | 0.6863 | 0.6970 | 0.6761 | 0.9154 |
0.077 | 80.5861 | 44000 | 0.5688 | 0.7463 | 0.8535 | 0.8664 | 0.8592 | 0.8755 | 0.9036 | 0.8583 | 0.8760 | 0.8869 | 0.8099 | 0.9010 | 0.8338 | 0.8629 | 0.7998 | 0.8509 | 0.8282 | 0.7651 | 0.8461 | 0.8025 | 0.9504 | 0.7777 | 0.7797 | 0.7702 | 0.7893 | 0.7733 | 0.7193 | 0.7441 | 0.7597 | 0.6706 | 0.7043 | 0.6729 | 0.7524 | 0.7556 | 0.7023 | 0.7405 | 0.6601 | 0.9144 |
0.157 | 87.9121 | 48000 | 0.5899 | 0.7461 | 0.8530 | 0.8667 | 0.8567 | 0.8936 | 0.9126 | 0.8858 | 0.8789 | 0.8671 | 0.8358 | 0.8843 | 0.7829 | 0.8759 | 0.8621 | 0.7755 | 0.8669 | 0.7841 | 0.7996 | 0.7827 | 0.9564 | 0.7788 | 0.7744 | 0.7384 | 0.7894 | 0.7758 | 0.7410 | 0.7388 | 0.7349 | 0.6856 | 0.7261 | 0.7241 | 0.7141 | 0.7745 | 0.6944 | 0.7012 | 0.6713 | 0.9202 |
0.1121 | 95.2381 | 52000 | 0.5786 | 0.7497 | 0.8572 | 0.8687 | 0.7989 | 0.8786 | 0.9104 | 0.8817 | 0.8724 | 0.8860 | 0.8292 | 0.8782 | 0.8114 | 0.8692 | 0.8686 | 0.8451 | 0.8437 | 0.8010 | 0.8048 | 0.8363 | 0.9572 | 0.7612 | 0.7862 | 0.7810 | 0.7833 | 0.7666 | 0.7264 | 0.7541 | 0.7763 | 0.7090 | 0.7208 | 0.6813 | 0.7268 | 0.7698 | 0.6926 | 0.6978 | 0.6915 | 0.9200 |
0.1639 | 102.5641 | 56000 | 0.6080 | 0.7492 | 0.8558 | 0.8690 | 0.8640 | 0.8562 | 0.8978 | 0.8556 | 0.8780 | 0.8913 | 0.8356 | 0.8889 | 0.8292 | 0.8292 | 0.8665 | 0.8356 | 0.8422 | 0.7435 | 0.8396 | 0.8296 | 0.9651 | 0.8072 | 0.7678 | 0.7487 | 0.7867 | 0.7676 | 0.7490 | 0.7302 | 0.7669 | 0.6984 | 0.7091 | 0.6520 | 0.7207 | 0.7901 | 0.6841 | 0.7341 | 0.7021 | 0.9220 |
0.1274 | 109.8901 | 60000 | 0.5982 | 0.7551 | 0.8589 | 0.8722 | 0.8467 | 0.8706 | 0.9042 | 0.8514 | 0.8906 | 0.9028 | 0.8519 | 0.9049 | 0.7823 | 0.8592 | 0.8388 | 0.8417 | 0.8580 | 0.7620 | 0.8412 | 0.8282 | 0.9673 | 0.7838 | 0.7805 | 0.7717 | 0.8013 | 0.7760 | 0.7264 | 0.7607 | 0.7828 | 0.6788 | 0.7438 | 0.6709 | 0.7439 | 0.7875 | 0.7008 | 0.7342 | 0.6706 | 0.9241 |
0.0471 | 117.2161 | 64000 | 0.6311 | 0.7516 | 0.8551 | 0.8701 | 0.8204 | 0.8726 | 0.9208 | 0.8911 | 0.8795 | 0.8946 | 0.8237 | 0.9084 | 0.8002 | 0.8610 | 0.8294 | 0.8125 | 0.8272 | 0.7370 | 0.8589 | 0.8228 | 0.9765 | 0.7672 | 0.7862 | 0.7681 | 0.7937 | 0.7888 | 0.7340 | 0.7439 | 0.7461 | 0.6789 | 0.7402 | 0.7042 | 0.7235 | 0.7818 | 0.6896 | 0.7305 | 0.6793 | 0.9216 |
0.1196 | 124.5421 | 68000 | 0.6434 | 0.7574 | 0.8591 | 0.8729 | 0.8381 | 0.8515 | 0.9121 | 0.8759 | 0.8960 | 0.9228 | 0.8405 | 0.9020 | 0.8199 | 0.8498 | 0.8307 | 0.8084 | 0.8578 | 0.7502 | 0.8544 | 0.8232 | 0.9713 | 0.7775 | 0.7835 | 0.7660 | 0.7959 | 0.7943 | 0.7476 | 0.7604 | 0.7338 | 0.7099 | 0.7565 | 0.7205 | 0.7249 | 0.7782 | 0.6936 | 0.7448 | 0.6688 | 0.9201 |
0.0608 | 131.8681 | 72000 | 0.6561 | 0.7643 | 0.8649 | 0.8778 | 0.8422 | 0.8783 | 0.9092 | 0.8826 | 0.9052 | 0.8930 | 0.8790 | 0.8970 | 0.7703 | 0.8836 | 0.8538 | 0.8109 | 0.8547 | 0.7824 | 0.8676 | 0.8203 | 0.9735 | 0.7761 | 0.8057 | 0.7746 | 0.8099 | 0.7989 | 0.7626 | 0.7788 | 0.7617 | 0.6689 | 0.7437 | 0.7134 | 0.7253 | 0.7774 | 0.7210 | 0.7450 | 0.7057 | 0.9240 |
0.3017 | 139.1941 | 76000 | 0.6601 | 0.7695 | 0.8682 | 0.8806 | 0.8440 | 0.8853 | 0.9105 | 0.8975 | 0.8843 | 0.8977 | 0.8319 | 0.9032 | 0.8348 | 0.8918 | 0.8607 | 0.7982 | 0.8622 | 0.7801 | 0.8673 | 0.8393 | 0.9705 | 0.7834 | 0.7974 | 0.7818 | 0.8137 | 0.7982 | 0.7578 | 0.7558 | 0.7736 | 0.7101 | 0.7564 | 0.7204 | 0.7175 | 0.7931 | 0.7169 | 0.7660 | 0.7131 | 0.9255 |
0.0676 | 146.5201 | 80000 | 0.6446 | 0.7707 | 0.8687 | 0.8815 | 0.8372 | 0.8897 | 0.9133 | 0.8882 | 0.9012 | 0.8940 | 0.8804 | 0.9149 | 0.8135 | 0.8804 | 0.8601 | 0.7968 | 0.8520 | 0.7825 | 0.8531 | 0.8347 | 0.9754 | 0.7825 | 0.8057 | 0.7876 | 0.8081 | 0.8047 | 0.7625 | 0.7868 | 0.7803 | 0.7166 | 0.7450 | 0.7153 | 0.7174 | 0.7854 | 0.7197 | 0.7427 | 0.7183 | 0.9234 |
0.0312 | 153.8462 | 84000 | 0.6653 | 0.7725 | 0.8702 | 0.8824 | 0.8550 | 0.8883 | 0.9019 | 0.8817 | 0.8976 | 0.9033 | 0.8715 | 0.9091 | 0.8286 | 0.8755 | 0.8668 | 0.8119 | 0.8624 | 0.7922 | 0.8500 | 0.8287 | 0.9695 | 0.7906 | 0.8047 | 0.7816 | 0.8141 | 0.8098 | 0.7654 | 0.7771 | 0.7698 | 0.7262 | 0.7632 | 0.7299 | 0.7208 | 0.7854 | 0.7184 | 0.7428 | 0.7067 | 0.9263 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0