# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.6495
- Map: 0.6577
- Map 50: 0.8888
- Map 75: 0.7196
- Map Small: -1.0
- Map Medium: 0.4565
- Map Large: 0.696
- Mar 1: 0.7167
- Mar 10: 0.8238
- Mar 100: 0.8571
- Mar Small: -1.0
- Mar Medium: 0.7
- Mar Large: 0.8833
- Map Raccoon: 0.0
- Mar 100 Raccoon: 0.0
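The metric names above line up with the output of torchmetrics' `MeanAveragePrecision` (the -1.0 values for the "small" size bucket follow the COCO convention for a bucket with no ground-truth objects). Which library produced these numbers is an assumption of this card, but a minimal sketch of computing the same quantities looks like this:

```python
# Hedged sketch: reproducing COCO-style mAP/mAR metrics with torchmetrics.
# The library choice is an assumption; the boxes below are toy data.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),  # xyxy pixel coords
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 52.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
# Keys include map, map_50, map_75, map_small/medium/large,
# mar_1, mar_10, mar_100, and per-class values when class_metrics=True.
print({k: v for k, v in results.items() if k.startswith(("map", "mar"))})
```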
## Model description
More information needed
## Intended uses & limitations
More information needed
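Pending fuller documentation, the checkpoint can be loaded through the standard transformers object-detection API. A minimal inference sketch (the repo id is taken from this card; the image path is a placeholder):

```python
# Minimal inference sketch for this checkpoint; "fruits.jpg" is a placeholder.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "gubringa/yolo_finetuned_fruits"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("fruits.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Rescale predictions back to the original image size and filter by score.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```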
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
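A hedged reconstruction of these settings as `TrainingArguments` (the `output_dir` is a placeholder; dataset loading, the data collator, and `compute_metrics` are omitted because the training data is not documented):

```python
# Sketch of the training configuration listed above; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="yolo_finetuned_fruits",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```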
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Raccoon | Mar 100 Raccoon |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 40 | 1.7327 | 0.0125 | 0.0323 | 0.0087 | -1.0 | 0.0009 | 0.0269 | 0.1643 | 0.3119 | 0.5643 | -1.0 | 0.1333 | 0.6361 | 0.0 | 0.0 |
No log | 2.0 | 80 | 1.5556 | 0.0334 | 0.0889 | 0.0197 | -1.0 | 0.0013 | 0.0512 | 0.1619 | 0.4143 | 0.6024 | -1.0 | 0.15 | 0.6778 | 0.0 | 0.0 |
No log | 3.0 | 120 | 1.3242 | 0.0751 | 0.1717 | 0.0527 | -1.0 | 0.0321 | 0.086 | 0.2381 | 0.5 | 0.6619 | -1.0 | 0.1833 | 0.7417 | 0.0 | 0.0 |
No log | 4.0 | 160 | 1.3937 | 0.0627 | 0.1689 | 0.0273 | -1.0 | 0.018 | 0.0749 | 0.2024 | 0.4 | 0.6571 | -1.0 | 0.3667 | 0.7056 | 0.0 | 0.0 |
No log | 5.0 | 200 | 1.4487 | 0.0691 | 0.186 | 0.0318 | -1.0 | 0.0136 | 0.0819 | 0.1976 | 0.381 | 0.6619 | -1.0 | 0.3833 | 0.7083 | 0.0 | 0.0 |
No log | 6.0 | 240 | 1.6055 | 0.072 | 0.1773 | 0.0367 | -1.0 | 0.0166 | 0.0826 | 0.2738 | 0.3976 | 0.6 | -1.0 | 0.1167 | 0.6806 | 0.0 | 0.0 |
No log | 7.0 | 280 | 1.2369 | 0.0891 | 0.1749 | 0.0764 | -1.0 | 0.0147 | 0.1052 | 0.3286 | 0.5643 | 0.6762 | -1.0 | 0.3 | 0.7389 | 0.0 | 0.0 |
No log | 8.0 | 320 | 1.1697 | 0.1039 | 0.2156 | 0.0769 | -1.0 | 0.0276 | 0.1226 | 0.35 | 0.5929 | 0.7071 | -1.0 | 0.35 | 0.7667 | 0.0 | 0.0 |
No log | 9.0 | 360 | 1.3522 | 0.0799 | 0.1892 | 0.043 | -1.0 | 0.0075 | 0.0965 | 0.3333 | 0.531 | 0.6667 | -1.0 | 0.2167 | 0.7417 | 0.0 | 0.0 |
No log | 10.0 | 400 | 1.1073 | 0.1661 | 0.3327 | 0.1245 | -1.0 | 0.0721 | 0.1853 | 0.4 | 0.6333 | 0.7333 | -1.0 | 0.45 | 0.7806 | 0.0 | 0.0 |
No log | 11.0 | 440 | 1.2113 | 0.1826 | 0.2611 | 0.1954 | -1.0 | 0.0132 | 0.2154 | 0.5452 | 0.6667 | 0.7143 | -1.0 | 0.1833 | 0.8028 | 0.0 | 0.0 |
No log | 12.0 | 480 | 0.7511 | 0.181 | 0.2856 | 0.1872 | -1.0 | 0.0802 | 0.2093 | 0.5429 | 0.7429 | 0.8214 | -1.0 | 0.6167 | 0.8556 | 0.0 | 0.0 |
1.1152 | 13.0 | 520 | 0.7134 | 0.2717 | 0.3694 | 0.3298 | -1.0 | 0.3034 | 0.2829 | 0.6524 | 0.7976 | 0.8429 | -1.0 | 0.7 | 0.8667 | 0.0 | 0.0 |
1.1152 | 14.0 | 560 | 0.7378 | 0.2944 | 0.4283 | 0.3525 | -1.0 | 0.3236 | 0.3003 | 0.631 | 0.7857 | 0.8405 | -1.0 | 0.65 | 0.8722 | 0.0 | 0.0 |
1.1152 | 15.0 | 600 | 0.6910 | 0.3206 | 0.4532 | 0.3833 | -1.0 | 0.2139 | 0.3496 | 0.6595 | 0.8095 | 0.8571 | -1.0 | 0.7 | 0.8833 | 0.0 | 0.0 |
1.1152 | 16.0 | 640 | 0.7127 | 0.3638 | 0.5373 | 0.3835 | -1.0 | 0.207 | 0.399 | 0.6214 | 0.7905 | 0.8571 | -1.0 | 0.6833 | 0.8861 | 0.0 | 0.0 |
1.1152 | 17.0 | 680 | 0.7322 | 0.423 | 0.6133 | 0.487 | -1.0 | 0.303 | 0.4664 | 0.6667 | 0.7952 | 0.85 | -1.0 | 0.6833 | 0.8778 | 0.0 | 0.0 |
1.1152 | 18.0 | 720 | 0.6799 | 0.4933 | 0.6995 | 0.5327 | -1.0 | 0.4098 | 0.5267 | 0.7167 | 0.8381 | 0.8667 | -1.0 | 0.7333 | 0.8889 | 0.0 | 0.0 |
1.1152 | 19.0 | 760 | 0.7052 | 0.5861 | 0.806 | 0.6607 | -1.0 | 0.4365 | 0.6174 | 0.6857 | 0.8286 | 0.8595 | -1.0 | 0.6667 | 0.8917 | 0.0 | 0.0 |
1.1152 | 20.0 | 800 | 0.6941 | 0.5829 | 0.8159 | 0.6512 | -1.0 | 0.4048 | 0.6189 | 0.6881 | 0.819 | 0.8571 | -1.0 | 0.6833 | 0.8861 | 0.0 | 0.0 |
1.1152 | 21.0 | 840 | 0.7119 | 0.6005 | 0.8281 | 0.6929 | -1.0 | 0.3645 | 0.6466 | 0.7024 | 0.7905 | 0.8429 | -1.0 | 0.6833 | 0.8694 | 0.0 | 0.0 |
1.1152 | 22.0 | 880 | 0.6753 | 0.6023 | 0.8291 | 0.6487 | -1.0 | 0.4107 | 0.6406 | 0.7214 | 0.819 | 0.8619 | -1.0 | 0.7167 | 0.8861 | 0.0 | 0.0 |
1.1152 | 23.0 | 920 | 0.6418 | 0.6598 | 0.8868 | 0.7259 | -1.0 | 0.4322 | 0.703 | 0.7262 | 0.8167 | 0.8643 | -1.0 | 0.7 | 0.8917 | 0.0 | 0.0 |
1.1152 | 24.0 | 960 | 0.6646 | 0.6521 | 0.8725 | 0.7208 | -1.0 | 0.4154 | 0.6961 | 0.7095 | 0.8262 | 0.869 | -1.0 | 0.7 | 0.8972 | 0.0 | 0.0 |
0.661 | 25.0 | 1000 | 0.6629 | 0.6552 | 0.8923 | 0.7389 | -1.0 | 0.4398 | 0.6956 | 0.7262 | 0.8167 | 0.8548 | -1.0 | 0.6667 | 0.8861 | 0.0 | 0.0 |
0.661 | 26.0 | 1040 | 0.6507 | 0.6501 | 0.8893 | 0.7147 | -1.0 | 0.4404 | 0.6893 | 0.7167 | 0.8119 | 0.8524 | -1.0 | 0.6833 | 0.8806 | 0.0 | 0.0 |
0.661 | 27.0 | 1080 | 0.6527 | 0.6502 | 0.8883 | 0.7192 | -1.0 | 0.4258 | 0.6913 | 0.7167 | 0.831 | 0.8548 | -1.0 | 0.6833 | 0.8833 | 0.0 | 0.0 |
0.661 | 28.0 | 1120 | 0.6488 | 0.6592 | 0.8861 | 0.7451 | -1.0 | 0.447 | 0.697 | 0.719 | 0.8262 | 0.8571 | -1.0 | 0.7 | 0.8833 | 0.0 | 0.0 |
0.661 | 29.0 | 1160 | 0.6496 | 0.6578 | 0.8885 | 0.7198 | -1.0 | 0.4565 | 0.6961 | 0.7167 | 0.8238 | 0.8571 | -1.0 | 0.7 | 0.8833 | 0.0 | 0.0 |
0.661 | 30.0 | 1200 | 0.6495 | 0.6577 | 0.8888 | 0.7196 | -1.0 | 0.4565 | 0.696 | 0.7167 | 0.8238 | 0.8571 | -1.0 | 0.7 | 0.8833 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1