# 50d_seg_model_20250507
This model is a fine-tuned version of nvidia/mit-b0 on the TommyClas/50d_seg_20250505 dataset. It achieves the following results on the evaluation set:

- Loss: 0.8496
- Mean IoU: 0.4618
- Mean Accuracy: 0.6974
- Overall Accuracy: 0.7722
- Accuracy Background: nan
- Accuracy Pores: 0.7921
- Accuracy LD C-S-H: 0.7918
- Accuracy HD C-S-H: 0.3186
- Accuracy Unhydrated cement particles: 0.8868
- IoU Background: 0.0
- IoU Pores: 0.6924
- IoU LD C-S-H: 0.5935
- IoU HD C-S-H: 0.2372
- IoU Unhydrated cement particles: 0.7862
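The aggregate numbers follow directly from the per-class ones: mean accuracy averages the four non-`nan` class accuracies, while mean IoU averages all five class IoUs, including the background class whose IoU is 0.0. A quick sanity check in plain Python (class keys are illustrative names, not identifiers from the dataset):

```python
# Per-class evaluation results copied from the summary above.
class_accuracy = {
    "pores": 0.7921,
    "ld_csh": 0.7918,
    "hd_csh": 0.3186,
    "unhydrated_cement": 0.8868,
    # background accuracy is nan and excluded from the mean
}
class_iou = {
    "background": 0.0,
    "pores": 0.6924,
    "ld_csh": 0.5935,
    "hd_csh": 0.2372,
    "unhydrated_cement": 0.7862,
}

mean_accuracy = sum(class_accuracy.values()) / len(class_accuracy)
mean_iou = sum(class_iou.values()) / len(class_iou)

print(round(mean_accuracy, 4))  # ~0.6973 (card reports 0.6974 from unrounded values)
print(round(mean_iou, 4))       # ~0.4619 (card reports 0.4618 from unrounded values)
```

The tiny discrepancies in the last digit come from the card averaging unrounded per-class values.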
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 10000
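With `lr_scheduler_type: polynomial` and no warmup reported, the learning rate decays from 0.01 toward 0 over the 10000 training steps. A sketch of that schedule, assuming the Transformers default decay power of 1.0 (which makes the polynomial schedule linear) and an end learning rate of 0 — both assumptions, since neither is stated in the card:

```python
def polynomial_lr(step: int,
                  base_lr: float = 0.01,     # learning_rate from the card
                  total_steps: int = 10000,  # training_steps from the card
                  power: float = 1.0,        # assumed default; power=1.0 is linear decay
                  end_lr: float = 0.0) -> float:
    """Learning rate at `step` under polynomial decay from base_lr to end_lr."""
    if step >= total_steps:
        return end_lr
    remaining = 1.0 - step / total_steps
    return (base_lr - end_lr) * remaining ** power + end_lr

print(polynomial_lr(0))      # 0.01
print(polynomial_lr(5000))   # 0.005 (halfway, linear decay)
print(polynomial_lr(10000))  # 0.0
```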
### Training results
Training Loss | Epoch | Step | Accuracy HD C-S-H | Accuracy LD C-S-H | Accuracy Pores | Accuracy Unhydrated cement particles | Accuracy Background | IoU HD C-S-H | IoU LD C-S-H | IoU Pores | IoU Unhydrated cement particles | IoU Background | Validation Loss | Mean Accuracy | Mean IoU | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.0019 | 1.0 | 250 | 0.0 | 0.0650 | 0.7441 | 0.4815 | nan | 0.0 | 0.0544 | 0.3893 | 0.4701 | 0.0 | 4.4151 | 0.3226 | 0.1828 | 0.3786 |
0.4554 | 2.0 | 500 | 0.0000 | 0.6842 | 0.7487 | 0.8724 | nan | 0.0000 | 0.5197 | 0.5813 | 0.7626 | 0.0 | 1.2340 | 0.5763 | 0.3727 | 0.6884 |
0.4529 | 3.0 | 750 | 0.0000 | 0.7384 | 0.7863 | 0.8553 | nan | 0.0000 | 0.5339 | 0.6291 | 0.7592 | 0.0 | 0.9681 | 0.5950 | 0.3844 | 0.7172 |
0.4248 | 4.0 | 1000 | 0.2327 | 0.5245 | 0.8565 | 0.8152 | nan | 0.1599 | 0.4278 | 0.5931 | 0.7471 | 0.0 | 1.0964 | 0.6072 | 0.3856 | 0.6744 |
0.4194 | 5.0 | 1250 | 0.2258 | 0.7465 | 0.4263 | 0.9194 | nan | 0.1022 | 0.4743 | 0.4199 | 0.7475 | 0.0 | 0.9903 | 0.5795 | 0.3488 | 0.6308 |
0.4183 | 6.0 | 1500 | 0.5113 | 0.4379 | 0.9017 | 0.7679 | nan | 0.1996 | 0.3818 | 0.6551 | 0.7309 | 0.0 | 0.9263 | 0.6547 | 0.3935 | 0.6717 |
0.4178 | 7.0 | 1750 | 0.2577 | 0.5691 | 0.9078 | 0.8036 | nan | 0.1468 | 0.4584 | 0.6702 | 0.7487 | 0.0 | 0.8280 | 0.6346 | 0.4048 | 0.7077 |
0.4144 | 8.0 | 2000 | 0.1812 | 0.6915 | 0.6035 | 0.9596 | nan | 0.1060 | 0.4844 | 0.5706 | 0.6832 | 0.0 | 0.9423 | 0.6090 | 0.3688 | 0.6757 |
0.4148 | 9.0 | 2250 | 0.3482 | 0.1554 | 0.9229 | 0.5755 | nan | 0.1635 | 0.1371 | 0.4818 | 0.5680 | 0.0 | 1.7458 | 0.5005 | 0.2701 | 0.5215 |
0.4051 | 10.0 | 2500 | 0.0512 | 0.7194 | 0.8609 | 0.7365 | nan | 0.0456 | 0.5033 | 0.6617 | 0.6965 | 0.0 | 0.8936 | 0.5920 | 0.3814 | 0.7146 |
0.4027 | 11.0 | 2750 | 0.1803 | 0.5811 | 0.9251 | 0.8497 | nan | 0.1368 | 0.4711 | 0.6614 | 0.7522 | 0.0 | 0.8490 | 0.6340 | 0.4043 | 0.7212 |
0.4 | 12.0 | 3000 | 0.0871 | 0.7090 | 0.8824 | 0.8546 | nan | 0.0746 | 0.5374 | 0.6926 | 0.7589 | 0.0 | 0.7884 | 0.6333 | 0.4127 | 0.7463 |
0.3966 | 13.0 | 3250 | 0.3821 | 0.7329 | 0.8081 | 0.7223 | nan | 0.2117 | 0.5417 | 0.6819 | 0.6861 | 0.0 | 0.7998 | 0.6614 | 0.4243 | 0.7265 |
0.3809 | 14.0 | 3500 | 0.1311 | 0.6534 | 0.9270 | 0.8173 | nan | 0.1055 | 0.5109 | 0.6812 | 0.7604 | 0.0 | 0.8045 | 0.6322 | 0.4116 | 0.7370 |
0.3769 | 15.0 | 3750 | 0.2447 | 0.7358 | 0.3770 | 0.9680 | nan | 0.1478 | 0.4557 | 0.3679 | 0.6794 | 0.0 | 1.0931 | 0.5814 | 0.3301 | 0.6222 |
0.3742 | 16.0 | 4000 | 0.0718 | 0.6090 | 0.9529 | 0.8171 | nan | 0.0647 | 0.4802 | 0.6614 | 0.7576 | 0.0 | 0.8481 | 0.6127 | 0.3928 | 0.7246 |
0.3723 | 17.0 | 4250 | 0.1708 | 0.7345 | 0.8772 | 0.8962 | nan | 0.1393 | 0.5764 | 0.7142 | 0.7782 | 0.0 | 0.6791 | 0.6697 | 0.4416 | 0.7698 |
0.369 | 18.0 | 4500 | 0.0727 | 0.7378 | 0.9273 | 0.7286 | nan | 0.0679 | 0.5349 | 0.7077 | 0.7059 | 0.0 | 0.7373 | 0.6166 | 0.4033 | 0.7440 |
0.3812 | 19.0 | 4750 | 0.1705 | 0.8129 | 0.8411 | 0.8514 | nan | 0.1543 | 0.5980 | 0.7109 | 0.7786 | 0.0 | 0.6597 | 0.6690 | 0.4484 | 0.7764 |
0.3686 | 20.0 | 5000 | 0.1371 | 0.7812 | 0.8773 | 0.8351 | nan | 0.1248 | 0.5836 | 0.7143 | 0.7709 | 0.0 | 0.7417 | 0.6577 | 0.4387 | 0.7709 |
0.3675 | 21.0 | 5250 | 0.2222 | 0.7268 | 0.9023 | 0.8715 | nan | 0.1882 | 0.5762 | 0.7135 | 0.7853 | 0.0 | 0.7173 | 0.6807 | 0.4526 | 0.7746 |
0.3647 | 22.0 | 5500 | 0.2299 | 0.7688 | 0.8603 | 0.8955 | nan | 0.1904 | 0.5974 | 0.7207 | 0.7851 | 0.0 | 0.7049 | 0.6886 | 0.4587 | 0.7813 |
0.3629 | 23.0 | 5750 | 0.2192 | 0.6911 | 0.9342 | 0.8092 | nan | 0.1833 | 0.5437 | 0.7023 | 0.7632 | 0.0 | 0.8170 | 0.6634 | 0.4385 | 0.7589 |
0.3623 | 24.0 | 6000 | 0.2782 | 0.7554 | 0.8817 | 0.8655 | nan | 0.2243 | 0.5923 | 0.7212 | 0.7869 | 0.0 | 0.7217 | 0.6952 | 0.4649 | 0.7814 |
0.3609 | 25.0 | 6250 | 0.2384 | 0.7263 | 0.9150 | 0.8090 | nan | 0.1964 | 0.5627 | 0.7133 | 0.7635 | 0.0 | 0.8099 | 0.6722 | 0.4472 | 0.7667 |
0.3619 | 26.0 | 6500 | 0.2477 | 0.8017 | 0.8433 | 0.8570 | nan | 0.2032 | 0.6010 | 0.7189 | 0.7839 | 0.0 | 0.7394 | 0.6874 | 0.4614 | 0.7808 |
0.3603 | 27.0 | 6750 | 0.2353 | 0.8035 | 0.8323 | 0.8729 | nan | 0.1923 | 0.6014 | 0.7161 | 0.7879 | 0.0 | 0.7378 | 0.6860 | 0.4595 | 0.7801 |
0.3591 | 28.0 | 7000 | 0.2726 | 0.7587 | 0.8749 | 0.8612 | nan | 0.2141 | 0.5885 | 0.7218 | 0.7867 | 0.0 | 0.7468 | 0.6918 | 0.4622 | 0.7789 |
0.3584 | 29.0 | 7250 | 0.2814 | 0.7926 | 0.7940 | 0.9150 | nan | 0.2133 | 0.5987 | 0.7036 | 0.7861 | 0.0 | 0.7495 | 0.6958 | 0.4603 | 0.7760 |
0.3564 | 30.0 | 7500 | 0.2201 | 0.8223 | 0.7788 | 0.9001 | nan | 0.1781 | 0.6001 | 0.6955 | 0.7865 | 0.0 | 0.7896 | 0.6803 | 0.4520 | 0.7733 |
0.3567 | 31.0 | 7750 | 0.2269 | 0.7825 | 0.8533 | 0.8827 | nan | 0.1887 | 0.5977 | 0.7195 | 0.7891 | 0.0 | 0.7667 | 0.6863 | 0.4590 | 0.7810 |
0.3551 | 32.0 | 8000 | 0.2755 | 0.8068 | 0.7922 | 0.9036 | nan | 0.2179 | 0.6029 | 0.7005 | 0.7866 | 0.0 | 0.7786 | 0.6945 | 0.4616 | 0.7776 |
0.3539 | 33.0 | 8250 | 0.2804 | 0.7869 | 0.8306 | 0.8862 | nan | 0.2222 | 0.5988 | 0.7122 | 0.7889 | 0.0 | 0.7576 | 0.6960 | 0.4644 | 0.7801 |
0.354 | 34.0 | 8500 | 0.2899 | 0.8027 | 0.7967 | 0.8969 | nan | 0.2255 | 0.6007 | 0.7016 | 0.7881 | 0.0 | 0.7926 | 0.6966 | 0.4632 | 0.7774 |
0.3519 | 35.0 | 8750 | 0.2869 | 0.7782 | 0.8519 | 0.8505 | nan | 0.2252 | 0.5914 | 0.7137 | 0.7825 | 0.0 | 0.7779 | 0.6919 | 0.4626 | 0.7771 |
0.3511 | 36.0 | 9000 | 0.3138 | 0.7967 | 0.7967 | 0.8878 | nan | 0.2358 | 0.5980 | 0.6982 | 0.7873 | 0.0 | 0.8104 | 0.6987 | 0.4638 | 0.7753 |
0.3505 | 37.0 | 9250 | 0.3016 | 0.8048 | 0.7830 | 0.8922 | nan | 0.2307 | 0.5975 | 0.6916 | 0.7868 | 0.0 | 0.8228 | 0.6954 | 0.4613 | 0.7735 |
0.3499 | 38.0 | 9500 | 0.2915 | 0.7767 | 0.8289 | 0.8786 | nan | 0.2290 | 0.5905 | 0.7023 | 0.7869 | 0.0 | 0.8103 | 0.6939 | 0.4617 | 0.7752 |
0.3493 | 39.0 | 9750 | 0.3065 | 0.7885 | 0.8075 | 0.8778 | nan | 0.2325 | 0.5925 | 0.6970 | 0.7863 | 0.0 | 0.8413 | 0.6951 | 0.4616 | 0.7733 |
0.3481 | 40.0 | 10000 | 0.3186 | 0.7918 | 0.7921 | 0.8868 | nan | 0.2372 | 0.5935 | 0.6924 | 0.7862 | 0.0 | 0.8496 | 0.6974 | 0.4618 | 0.7722 |
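Per-class IoU and accuracy of the kind tabulated above can be reproduced from predicted and ground-truth label maps. The `nan` background accuracy suggests background pixels never appear as ground truth during evaluation (e.g. via a reduce-labels or ignore-index setting); the sketch below (not the card's actual evaluation code) reproduces that by returning `nan` for classes absent from the labels:

```python
def per_class_metrics(pred, gt, num_classes):
    """Compute per-class IoU and accuracy from flat prediction/label lists."""
    inter = [0] * num_classes   # pixels where pred == gt == c
    pred_n = [0] * num_classes  # pixels predicted as class c
    gt_n = [0] * num_classes    # pixels labeled as class c
    for p, g in zip(pred, gt):
        pred_n[p] += 1
        gt_n[g] += 1
        if p == g:
            inter[p] += 1
    iou, acc = [], []
    for c in range(num_classes):
        union = pred_n[c] + gt_n[c] - inter[c]
        iou.append(inter[c] / union if union else float("nan"))
        acc.append(inter[c] / gt_n[c] if gt_n[c] else float("nan"))
    return iou, acc

# Tiny worked example with 3 classes.
gt   = [0, 0, 1, 1, 2, 2]
pred = [0, 1, 1, 1, 2, 0]
iou, acc = per_class_metrics(pred, gt, 3)
print(iou)  # [0.333..., 0.666..., 0.5]
print(acc)  # [0.5, 1.0, 0.5]
```

Mean IoU and mean accuracy are then plain averages over the classes with defined values.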
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1