---
library_name: transformers
license: apache-2.0
base_model: hustvl/yolos-tiny
tags:
- generated_from_trainer
model-index:
- name: yolo_finetuned_fruits
  results: []
---

# yolo_finetuned_fruits

This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified fruit-detection dataset (banana, orange, and apple classes).
It achieves the following results on the evaluation set ("Map"/"Mar" are COCO-style mAP/mAR; a value of -1.0 indicates an empty size bucket, i.e. no small objects in the evaluation set):
- Loss: 0.7819
- Map: 0.5883
- Map 50: 0.8521
- Map 75: 0.6633
- Map Small: -1.0
- Map Medium: 0.4917
- Map Large: 0.6223
- Mar 1: 0.4441
- Mar 10: 0.7224
- Mar 100: 0.7722
- Mar Small: -1.0
- Mar Medium: 0.6417
- Mar Large: 0.7892
- Map Banana: 0.4472
- Mar 100 Banana: 0.7275
- Map Orange: 0.6126
- Mar 100 Orange: 0.7833
- Map Apple: 0.7051
- Mar 100 Apple: 0.8057

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30

A `TrainingArguments` sketch of this configuration is included at the end of this card.

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|
| No log | 1.0 | 60 | 1.9158 | 0.0195 | 0.0636 | 0.0063 | -1.0 | 0.027 | 0.0198 | 0.0473 | 0.1897 | 0.3615 | -1.0 | 0.2183 | 0.3761 | 0.04 | 0.385 | 0.0011 | 0.0881 | 0.0174 | 0.6114 |
| No log | 2.0 | 120 | 2.0371 | 0.0268 | 0.078 | 0.012 | -1.0 | 0.0374 | 0.0262 | 0.0794 | 0.2079 | 0.3617 | -1.0 | 0.2 | 0.3771 | 0.039 | 0.4075 | 0.0053 | 0.1119 | 0.0361 | 0.5657 |
| No log | 3.0 | 180 | 1.4488 | 0.0432 | 0.1226 | 0.0229 | -1.0 | 0.0661 | 0.0401 | 0.1905 | 0.3398 | 0.5017 | -1.0 | 0.2583 | 0.5219 | 0.069 | 0.625 | 0.0333 | 0.2714 | 0.0272 | 0.6086 |
| No log | 4.0 | 240 | 1.2716 | 0.0733 | 0.1622 | 0.0681 | -1.0 | 0.1318 | 0.0715 | 0.242 | 0.4183 | 0.6169 | -1.0 | 0.3967 | 0.6399 | 0.0732 | 0.6425 | 0.0524 | 0.4452 | 0.0943 | 0.7629 |
| No log | 5.0 | 300 | 1.1472 | 0.1133 | 0.2136 | 0.1094 | -1.0 | 0.1777 | 0.1156 | 0.2851 | 0.48 | 0.6252 | -1.0 | 0.405 | 0.6456 | 0.1031 | 0.7 | 0.1103 | 0.4071 | 0.1265 | 0.7686 |
| No log | 6.0 | 360 | 1.1300 | 0.1191 | 0.2449 | 0.1125 | -1.0 | 0.1469 | 0.1307 | 0.2699 | 0.4715 | 0.6694 | -1.0 | 0.4333 | 0.6956 | 0.0991 | 0.67 | 0.1202 | 0.6095 | 0.1379 | 0.7286 |
| No log | 7.0 | 420 | 1.0298 | 0.1813 | 0.3059 | 0.1929 | -1.0 | 0.2227 | 0.2027 | 0.3479 | 0.5318 | 0.6733 | -1.0 | 0.45 | 0.6983 | 0.1167 | 0.67 | 0.2302 | 0.55 | 0.197 | 0.8 |
| No log | 8.0 | 480 | 1.0293 | 0.2451 | 0.4266 | 0.2668 | -1.0 | 0.2595 | 0.2725 | 0.3298 | 0.5753 | 0.7069 | -1.0 | 0.5267 | 0.7266 | 0.1943 | 0.7125 | 0.3182 | 0.6738 | 0.2229 | 0.7343 |
| 1.2845 | 9.0 | 540 | 1.0215 | 0.3761 | 0.6485 | 0.3915 | -1.0 | 0.4188 | 0.3996 | 0.3375 | 0.6398 | 0.7336 | -1.0 | 0.6383 | 0.7473 | 0.2155 | 0.6775 | 0.428 | 0.7405 | 0.4848 | 0.7829 |
| 1.2845 | 10.0 | 600 | 0.9666 | 0.4652 | 0.7264 | 0.5036 | -1.0 | 0.3976 | 0.497 | 0.3696 | 0.6634 | 0.7315 | -1.0 | 0.5817 | 0.7512 | 0.3139 | 0.6775 | 0.4658 | 0.7286 | 0.616 | 0.7886 |
| 1.2845 | 11.0 | 660 | 0.9365 | 0.4826 | 0.7627 | 0.5587 | -1.0 | 0.4124 | 0.5147 | 0.3787 | 0.6606 | 0.7163 | -1.0 | 0.505 | 0.7414 | 0.3238 | 0.6875 | 0.4915 | 0.7071 | 0.6327 | 0.7543 |
| 1.2845 | 12.0 | 720 | 0.9472 | 0.4644 | 0.7652 | 0.5261 | -1.0 | 0.3875 | 0.5056 | 0.3719 | 0.6594 | 0.7294 | -1.0 | 0.5717 | 0.7484 | 0.3286 | 0.7025 | 0.4983 | 0.7143 | 0.5663 | 0.7714 |
| 1.2845 | 13.0 | 780 | 0.9087 | 0.4966 | 0.764 | 0.5557 | -1.0 | 0.4804 | 0.5252 | 0.3921 | 0.679 | 0.7517 | -1.0 | 0.6483 | 0.7656 | 0.3481 | 0.71 | 0.5176 | 0.7595 | 0.624 | 0.7857 |
| 1.2845 | 14.0 | 840 | 0.8610 | 0.5198 | 0.7753 | 0.5692 | -1.0 | 0.4833 | 0.5606 | 0.4232 | 0.7004 | 0.7477 | -1.0 | 0.6633 | 0.7606 | 0.409 | 0.685 | 0.5204 | 0.7667 | 0.63 | 0.7914 |
| 1.2845 | 15.0 | 900 | 0.8564 | 0.5518 | 0.8086 | 0.6727 | -1.0 | 0.5648 | 0.5817 | 0.411 | 0.6983 | 0.7569 | -1.0 | 0.645 | 0.7717 | 0.4321 | 0.715 | 0.5533 | 0.7643 | 0.6701 | 0.7914 |
| 1.2845 | 16.0 | 960 | 0.8996 | 0.5348 | 0.8088 | 0.6341 | -1.0 | 0.4901 | 0.5621 | 0.4183 | 0.6793 | 0.745 | -1.0 | 0.6383 | 0.7595 | 0.4119 | 0.6975 | 0.5284 | 0.7405 | 0.6642 | 0.7971 |
| 0.8009 | 17.0 | 1020 | 0.8437 | 0.5527 | 0.8203 | 0.6544 | -1.0 | 0.4749 | 0.5871 | 0.4243 | 0.6989 | 0.7535 | -1.0 | 0.6067 | 0.7722 | 0.4025 | 0.71 | 0.5727 | 0.7476 | 0.683 | 0.8029 |
| 0.8009 | 18.0 | 1080 | 0.8433 | 0.5625 | 0.8238 | 0.6682 | -1.0 | 0.4952 | 0.5982 | 0.4334 | 0.6974 | 0.7577 | -1.0 | 0.5983 | 0.7777 | 0.407 | 0.7175 | 0.5929 | 0.7643 | 0.6876 | 0.7914 |
| 0.8009 | 19.0 | 1140 | 0.8158 | 0.5855 | 0.8359 | 0.6588 | -1.0 | 0.5315 | 0.614 | 0.4387 | 0.7157 | 0.7715 | -1.0 | 0.6267 | 0.7896 | 0.4249 | 0.735 | 0.6071 | 0.7738 | 0.7245 | 0.8057 |
| 0.8009 | 20.0 | 1200 | 0.7977 | 0.586 | 0.8433 | 0.6602 | -1.0 | 0.5306 | 0.6157 | 0.4415 | 0.7192 | 0.7753 | -1.0 | 0.6433 | 0.7929 | 0.4119 | 0.7225 | 0.6322 | 0.7833 | 0.7138 | 0.82 |
| 0.8009 | 21.0 | 1260 | 0.8195 | 0.5916 | 0.8465 | 0.6581 | -1.0 | 0.5731 | 0.6166 | 0.442 | 0.7196 | 0.7795 | -1.0 | 0.6733 | 0.7941 | 0.4367 | 0.73 | 0.616 | 0.7857 | 0.7222 | 0.8229 |
| 0.8009 | 22.0 | 1320 | 0.7861 | 0.5915 | 0.8481 | 0.6645 | -1.0 | 0.5399 | 0.619 | 0.4396 | 0.7219 | 0.7785 | -1.0 | 0.6583 | 0.7943 | 0.4391 | 0.735 | 0.6303 | 0.7976 | 0.7052 | 0.8029 |
| 0.8009 | 23.0 | 1380 | 0.8101 | 0.5835 | 0.848 | 0.6618 | -1.0 | 0.5154 | 0.6151 | 0.4409 | 0.7128 | 0.7804 | -1.0 | 0.64 | 0.7979 | 0.4427 | 0.7475 | 0.614 | 0.7881 | 0.6937 | 0.8057 |
| 0.8009 | 24.0 | 1440 | 0.7936 | 0.5912 | 0.8462 | 0.6779 | -1.0 | 0.5577 | 0.6196 | 0.4438 | 0.7217 | 0.7819 | -1.0 | 0.6733 | 0.7962 | 0.4422 | 0.7425 | 0.6164 | 0.7833 | 0.715 | 0.82 |
| 0.5971 | 25.0 | 1500 | 0.7935 | 0.5879 | 0.8557 | 0.6645 | -1.0 | 0.4766 | 0.6217 | 0.441 | 0.7235 | 0.7775 | -1.0 | 0.6683 | 0.7921 | 0.4412 | 0.7325 | 0.6318 | 0.8 | 0.6907 | 0.8 |
| 0.5971 | 26.0 | 1560 | 0.7936 | 0.5867 | 0.854 | 0.6559 | -1.0 | 0.4773 | 0.6209 | 0.4406 | 0.719 | 0.7754 | -1.0 | 0.6417 | 0.7922 | 0.4459 | 0.74 | 0.6115 | 0.7833 | 0.7028 | 0.8029 |
| 0.5971 | 27.0 | 1620 | 0.7856 | 0.5904 | 0.8561 | 0.6682 | -1.0 | 0.5188 | 0.6238 | 0.4441 | 0.7217 | 0.7748 | -1.0 | 0.6417 | 0.7919 | 0.4463 | 0.7325 | 0.6143 | 0.7833 | 0.7105 | 0.8086 |
| 0.5971 | 28.0 | 1680 | 0.7838 | 0.5918 | 0.8561 | 0.6678 | -1.0 | 0.4937 | 0.6265 | 0.4448 | 0.7231 | 0.7746 | -1.0 | 0.6417 | 0.7919 | 0.4458 | 0.7275 | 0.6251 | 0.7905 | 0.7047 | 0.8057 |
| 0.5971 | 29.0 | 1740 | 0.7819 | 0.592 | 0.8569 | 0.6674 | -1.0 | 0.4967 | 0.6266 | 0.4457 | 0.724 | 0.7738 | -1.0 | 0.6417 | 0.791 | 0.4476 | 0.7275 | 0.6235 | 0.7881 | 0.7047 | 0.8057 |
| 0.5971 | 30.0 | 1800 | 0.7819 | 0.5883 | 0.8521 | 0.6633 | -1.0 | 0.4917 | 0.6223 | 0.4441 | 0.7224 | 0.7722 | -1.0 | 0.6417 | 0.7892 | 0.4472 | 0.7275 | 0.6126 | 0.7833 | 0.7051 | 0.8057 |

### Framework versions

- Transformers 4.50.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
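
### Training configuration sketch

For reference, a minimal sketch of how the hyperparameters listed above map onto a `transformers` `TrainingArguments` object. This is an assumption-based reconstruction, not the exact script used for this run: `output_dir` is a placeholder, data loading and the collator are omitted, and the AdamW betas/epsilon noted in the comment are the library defaults.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the "Training hyperparameters" listed above.
training_args = TrainingArguments(
    output_dir="yolo_finetuned_fruits",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```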
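
## Example inference (sketch)

The card does not yet document usage, so the snippet below is a minimal inference sketch using the standard `transformers` object-detection API. The checkpoint path `yolo_finetuned_fruits` and the input file `fruits.jpg` are placeholders, and the 0.5 score threshold is an arbitrary choice.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "yolo_finetuned_fruits"  # placeholder: local path or Hub repo id of this model

image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("fruits.jpg")  # placeholder input image
inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs into labels, scores and boxes in original image coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```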