# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unnamed fruit object-detection dataset covering three classes (banana, orange, apple). It achieves the following results on the evaluation set (an inference sketch follows the metrics):
- Loss: 0.8075
- mAP: 0.5492
- mAP@50: 0.8129
- mAP@75: 0.6184
- mAP Small: -1.0 (no small objects in the evaluation set)
- mAP Medium: 0.5412
- mAP Large: 0.5745
- mAR@1: 0.4367
- mAR@10: 0.7285
- mAR@100: 0.7829
- mAR Small: -1.0 (no small objects in the evaluation set)
- mAR Medium: 0.7643
- mAR Large: 0.7895
- mAP Banana: 0.4035
- mAR@100 Banana: 0.73
- mAP Orange: 0.5513
- mAR@100 Orange: 0.7929
- mAP Apple: 0.6929
- mAR@100 Apple: 0.8257
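A minimal inference sketch, assuming the checkpoint is loaded from the Hub under the repo id `Igmata/yolo_finetuned_fruits` listed on this page; the input filename and the 0.5 confidence threshold are illustrative choices, not values from the original card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Load the fine-tuned YOLOS checkpoint and its image processor from the Hub.
processor = AutoImageProcessor.from_pretrained("Igmata/yolo_finetuned_fruits")
model = AutoModelForObjectDetection.from_pretrained("Igmata/yolo_finetuned_fruits")

image = Image.open("fruits.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples at the original resolution.
target_sizes = torch.tensor([image.size[::-1]])  # PIL gives (width, height); we need (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```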
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
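These values map directly onto `transformers.TrainingArguments`. The sketch below is an assumed reconstruction using the standard Trainer API (which auto-generated cards like this one imply); `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="yolo_finetuned_fruits",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```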
### Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP Small | mAP Medium | mAP Large | mAR@1 | mAR@10 | mAR@100 | mAR Small | mAR Medium | mAR Large | mAP Banana | mAR@100 Banana | mAP Orange | mAR@100 Orange | mAP Apple | mAR@100 Apple |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 60 | 2.1930 | 0.0025 | 0.0072 | 0.0016 | -1.0 | 0.0007 | 0.0035 | 0.0161 | 0.0825 | 0.2242 | -1.0 | 0.0429 | 0.2487 | 0.0015 | 0.2925 | 0.0 | 0.0 | 0.0061 | 0.38 |
No log | 2.0 | 120 | 1.9326 | 0.011 | 0.0299 | 0.0064 | -1.0 | 0.0044 | 0.0134 | 0.0758 | 0.2107 | 0.3813 | -1.0 | 0.1214 | 0.416 | 0.0137 | 0.45 | 0.0062 | 0.1738 | 0.0131 | 0.52 |
No log | 3.0 | 180 | 1.6307 | 0.0352 | 0.0947 | 0.0195 | -1.0 | 0.0531 | 0.0355 | 0.1342 | 0.303 | 0.503 | -1.0 | 0.3214 | 0.5277 | 0.0413 | 0.5375 | 0.0424 | 0.3714 | 0.0218 | 0.6 |
No log | 4.0 | 240 | 1.6542 | 0.0558 | 0.1344 | 0.0522 | -1.0 | 0.1515 | 0.0482 | 0.0944 | 0.2671 | 0.4604 | -1.0 | 0.35 | 0.4725 | 0.0524 | 0.5075 | 0.0662 | 0.3881 | 0.0487 | 0.4857 |
No log | 5.0 | 300 | 1.6691 | 0.0388 | 0.1063 | 0.0274 | -1.0 | 0.0944 | 0.0364 | 0.1583 | 0.2932 | 0.4751 | -1.0 | 0.35 | 0.4917 | 0.043 | 0.5 | 0.0359 | 0.3452 | 0.0374 | 0.58 |
No log | 6.0 | 360 | 1.1086 | 0.0826 | 0.1345 | 0.0841 | -1.0 | 0.2117 | 0.0861 | 0.2797 | 0.4782 | 0.7029 | -1.0 | 0.5929 | 0.7177 | 0.071 | 0.7225 | 0.1168 | 0.6262 | 0.06 | 0.76 |
No log | 7.0 | 420 | 1.1675 | 0.0814 | 0.165 | 0.072 | -1.0 | 0.2427 | 0.086 | 0.2683 | 0.4722 | 0.6522 | -1.0 | 0.4929 | 0.6746 | 0.0837 | 0.6575 | 0.0851 | 0.5619 | 0.0754 | 0.7371 |
No log | 8.0 | 480 | 1.0365 | 0.1206 | 0.207 | 0.1248 | -1.0 | 0.2508 | 0.1171 | 0.3042 | 0.5348 | 0.7123 | -1.0 | 0.6143 | 0.7282 | 0.0767 | 0.6875 | 0.1441 | 0.681 | 0.1411 | 0.7686 |
1.512 | 9.0 | 540 | 1.0794 | 0.1506 | 0.2487 | 0.1703 | -1.0 | 0.2685 | 0.1558 | 0.3506 | 0.5771 | 0.6842 | -1.0 | 0.4857 | 0.7162 | 0.0875 | 0.645 | 0.1822 | 0.6619 | 0.1823 | 0.7457 |
1.512 | 10.0 | 600 | 0.9685 | 0.2052 | 0.3178 | 0.2417 | -1.0 | 0.3075 | 0.2088 | 0.3638 | 0.5795 | 0.713 | -1.0 | 0.5571 | 0.7386 | 0.1142 | 0.66 | 0.2011 | 0.6619 | 0.3002 | 0.8171 |
1.512 | 11.0 | 660 | 1.0193 | 0.2702 | 0.4348 | 0.3242 | -1.0 | 0.3423 | 0.287 | 0.3652 | 0.6083 | 0.6889 | -1.0 | 0.6429 | 0.699 | 0.1441 | 0.64 | 0.263 | 0.6952 | 0.4036 | 0.7314 |
1.512 | 12.0 | 720 | 0.9402 | 0.3339 | 0.5175 | 0.3808 | -1.0 | 0.358 | 0.3523 | 0.3898 | 0.637 | 0.7244 | -1.0 | 0.6286 | 0.7421 | 0.2116 | 0.67 | 0.3413 | 0.7262 | 0.4489 | 0.7771 |
1.512 | 13.0 | 780 | 0.9065 | 0.4067 | 0.6265 | 0.4574 | -1.0 | 0.5061 | 0.4159 | 0.3831 | 0.6531 | 0.7409 | -1.0 | 0.6286 | 0.76 | 0.2899 | 0.705 | 0.3526 | 0.7262 | 0.5776 | 0.7914 |
1.512 | 14.0 | 840 | 0.8992 | 0.4333 | 0.6571 | 0.4951 | -1.0 | 0.5391 | 0.4405 | 0.3823 | 0.679 | 0.7469 | -1.0 | 0.6929 | 0.7585 | 0.2879 | 0.6975 | 0.4142 | 0.7405 | 0.5978 | 0.8029 |
1.512 | 15.0 | 900 | 0.9158 | 0.4523 | 0.6711 | 0.5006 | -1.0 | 0.567 | 0.457 | 0.3885 | 0.6792 | 0.7503 | -1.0 | 0.7429 | 0.7554 | 0.3111 | 0.6875 | 0.4015 | 0.7548 | 0.6444 | 0.8086 |
1.512 | 16.0 | 960 | 0.8610 | 0.4903 | 0.7499 | 0.5371 | -1.0 | 0.5782 | 0.4965 | 0.4083 | 0.6934 | 0.7603 | -1.0 | 0.75 | 0.7656 | 0.3468 | 0.6975 | 0.4646 | 0.769 | 0.6594 | 0.8143 |
0.8176 | 17.0 | 1020 | 0.8541 | 0.5026 | 0.7497 | 0.5756 | -1.0 | 0.6024 | 0.509 | 0.4079 | 0.7004 | 0.7741 | -1.0 | 0.7429 | 0.783 | 0.363 | 0.7175 | 0.5092 | 0.7905 | 0.6356 | 0.8143 |
0.8176 | 18.0 | 1080 | 0.8627 | 0.4944 | 0.7614 | 0.5615 | -1.0 | 0.58 | 0.5081 | 0.4067 | 0.6975 | 0.7571 | -1.0 | 0.7 | 0.7686 | 0.3636 | 0.715 | 0.501 | 0.7476 | 0.6185 | 0.8086 |
0.8176 | 19.0 | 1140 | 0.8270 | 0.5227 | 0.7928 | 0.5967 | -1.0 | 0.589 | 0.5339 | 0.4137 | 0.7212 | 0.7767 | -1.0 | 0.7143 | 0.789 | 0.3864 | 0.735 | 0.5444 | 0.781 | 0.6372 | 0.8143 |
0.8176 | 20.0 | 1200 | 0.8100 | 0.5428 | 0.807 | 0.629 | -1.0 | 0.5925 | 0.561 | 0.4291 | 0.7177 | 0.7721 | -1.0 | 0.7571 | 0.7787 | 0.4188 | 0.7125 | 0.553 | 0.7952 | 0.6567 | 0.8086 |
0.8176 | 21.0 | 1260 | 0.8255 | 0.5424 | 0.8012 | 0.6145 | -1.0 | 0.5723 | 0.5611 | 0.4269 | 0.7175 | 0.7674 | -1.0 | 0.7286 | 0.7775 | 0.3995 | 0.7075 | 0.5572 | 0.7833 | 0.6703 | 0.8114 |
0.8176 | 22.0 | 1320 | 0.8203 | 0.5447 | 0.8214 | 0.6081 | -1.0 | 0.5544 | 0.567 | 0.4308 | 0.7186 | 0.7785 | -1.0 | 0.75 | 0.7863 | 0.3999 | 0.7275 | 0.5527 | 0.7881 | 0.6815 | 0.82 |
0.8176 | 23.0 | 1380 | 0.8116 | 0.555 | 0.8196 | 0.6291 | -1.0 | 0.5953 | 0.569 | 0.4345 | 0.7297 | 0.7793 | -1.0 | 0.75 | 0.7874 | 0.4045 | 0.725 | 0.5768 | 0.7929 | 0.6836 | 0.82 |
0.8176 | 24.0 | 1440 | 0.8178 | 0.5431 | 0.7922 | 0.6252 | -1.0 | 0.5631 | 0.5629 | 0.4217 | 0.7217 | 0.7755 | -1.0 | 0.7357 | 0.7849 | 0.3973 | 0.7275 | 0.5501 | 0.7905 | 0.6819 | 0.8086 |
0.6165 | 25.0 | 1500 | 0.8056 | 0.5533 | 0.8126 | 0.6213 | -1.0 | 0.569 | 0.5718 | 0.43 | 0.7249 | 0.779 | -1.0 | 0.75 | 0.787 | 0.4049 | 0.7275 | 0.5546 | 0.781 | 0.7002 | 0.8286 |
0.6165 | 26.0 | 1560 | 0.7900 | 0.556 | 0.8143 | 0.6332 | -1.0 | 0.552 | 0.5828 | 0.4417 | 0.7364 | 0.7812 | -1.0 | 0.7643 | 0.7873 | 0.418 | 0.7325 | 0.55 | 0.7881 | 0.7002 | 0.8229 |
0.6165 | 27.0 | 1620 | 0.8072 | 0.5466 | 0.8125 | 0.6105 | -1.0 | 0.5431 | 0.5733 | 0.4367 | 0.7327 | 0.7787 | -1.0 | 0.7571 | 0.786 | 0.4051 | 0.725 | 0.5465 | 0.7881 | 0.688 | 0.8229 |
0.6165 | 28.0 | 1680 | 0.8077 | 0.5481 | 0.8135 | 0.6199 | -1.0 | 0.5418 | 0.5725 | 0.4351 | 0.7263 | 0.783 | -1.0 | 0.7643 | 0.7899 | 0.4018 | 0.7275 | 0.5497 | 0.7929 | 0.6927 | 0.8286 |
0.6165 | 29.0 | 1740 | 0.8085 | 0.5488 | 0.812 | 0.618 | -1.0 | 0.541 | 0.574 | 0.4367 | 0.7277 | 0.7837 | -1.0 | 0.7643 | 0.7905 | 0.4021 | 0.73 | 0.5514 | 0.7952 | 0.6929 | 0.8257 |
0.6165 | 30.0 | 1800 | 0.8075 | 0.5492 | 0.8129 | 0.6184 | -1.0 | 0.5412 | 0.5745 | 0.4367 | 0.7285 | 0.7829 | -1.0 | 0.7643 | 0.7895 | 0.4035 | 0.73 | 0.5513 | 0.7929 | 0.6929 | 0.8257 |
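The mAP/mAR columns follow the COCO convention, where -1.0 marks a size bucket (here, "small") that contains no ground-truth boxes. A minimal sketch of computing these metrics with `torchmetrics`, a common choice for object-detection evaluation; whether the original run used it is an assumption, and the boxes below are toy values:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True also yields per-class values (here: banana, orange, apple).
metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)

# Toy prediction/target pair; boxes are (xmin, ymin, xmax, ymax).
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 120.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),  # e.g. 0 = banana
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 122.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
result = metric.compute()
# Keys include map, map_50, map_75, map_small/medium/large, mar_1/10/100, ...
# Size buckets with no ground truths come back as -1.0.
print(result["map"], result["map_50"], result["map_small"])
```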
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.1