detr_finetuned_kitti_mots-bright
This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on the KITTI MOTS dataset. It achieves the following results on the evaluation set:
- Loss: 0.6703
- mAP: 0.4803
- mAP@50: 0.7799
- mAP@75: 0.494
- mAP (small): 0.2239
- mAP (medium): 0.4794
- mAP (large): 0.7442
- mAR@1: 0.1586
- mAR@10: 0.5224
- mAR@100: 0.6196
- mAR (small): 0.4422
- mAR (medium): 0.6281
- mAR (large): 0.8123
- mAP (Car): 0.6182
- mAR@100 (Car): 0.7081
- mAP (Pedestrian): 0.3423
- mAR@100 (Pedestrian): 0.5311
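The card does not include a usage example, so the following is a minimal, untested sketch of how the checkpoint could be loaded for object detection with the Transformers API. The hub repo id is assumed from the model name, and the image path and score threshold are placeholders.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Assumed hub repo id; adjust if the checkpoint lives elsewhere.
checkpoint = "toukapy/detr_finetuned_kitti_mots-bright"

image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("kitti_frame.png")  # placeholder: any street-scene image

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```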
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 60
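For reference, the hyperparameters above map onto Hugging Face `TrainingArguments` roughly as sketched below. The output directory, evaluation strategy, and any option not listed in this card are assumptions, not values recovered from the original run.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
# Anything not stated in the card (output_dir, eval strategy, column handling)
# is an assumption for illustration only.
training_args = TrainingArguments(
    output_dir="detr_finetuned_kitti_mots-bright",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=60,
    eval_strategy="epoch",        # the results table reports metrics once per epoch
    remove_unused_columns=False,  # commonly required for object-detection collators
)
```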
Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (Car) | mAR@100 (Car) | mAP (Pedestrian) | mAR@100 (Pedestrian)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
0.9436 | 1.0 | 743 | 0.8850 | 0.3518 | 0.6491 | 0.3381 | 0.1025 | 0.3437 | 0.6351 | 0.1336 | 0.4063 | 0.5187 | 0.3121 | 0.5288 | 0.7428 | 0.4742 | 0.5947 | 0.2293 | 0.4428 |
0.8864 | 2.0 | 1486 | 0.8514 | 0.3714 | 0.6685 | 0.3602 | 0.1196 | 0.3636 | 0.6536 | 0.1384 | 0.4226 | 0.5403 | 0.3347 | 0.5536 | 0.7549 | 0.5002 | 0.6203 | 0.2426 | 0.4603 |
0.8577 | 3.0 | 2229 | 0.8346 | 0.3815 | 0.6809 | 0.3741 | 0.1251 | 0.3727 | 0.6703 | 0.1428 | 0.4299 | 0.5424 | 0.3401 | 0.5521 | 0.7616 | 0.5075 | 0.6206 | 0.2555 | 0.4642 |
0.8454 | 4.0 | 2972 | 0.8225 | 0.3872 | 0.6848 | 0.3845 | 0.1313 | 0.3804 | 0.671 | 0.142 | 0.4376 | 0.5496 | 0.3441 | 0.5631 | 0.763 | 0.5173 | 0.6291 | 0.2571 | 0.47 |
0.8258 | 5.0 | 3715 | 0.8161 | 0.3929 | 0.6901 | 0.3893 | 0.1383 | 0.3859 | 0.6733 | 0.1434 | 0.4402 | 0.5486 | 0.3469 | 0.561 | 0.7605 | 0.5246 | 0.6332 | 0.2613 | 0.464 |
0.8186 | 6.0 | 4458 | 0.8088 | 0.3963 | 0.6967 | 0.3967 | 0.1404 | 0.3882 | 0.6784 | 0.1441 | 0.4437 | 0.5516 | 0.3549 | 0.5603 | 0.7673 | 0.5232 | 0.6351 | 0.2695 | 0.4681 |
0.8058 | 7.0 | 5201 | 0.7959 | 0.4054 | 0.7096 | 0.4078 | 0.1448 | 0.4036 | 0.6818 | 0.1444 | 0.4565 | 0.5626 | 0.3653 | 0.5749 | 0.7706 | 0.5338 | 0.6433 | 0.2771 | 0.4819 |
0.7957 | 8.0 | 5944 | 0.7860 | 0.4115 | 0.7072 | 0.4119 | 0.1455 | 0.4088 | 0.6917 | 0.1469 | 0.4561 | 0.5666 | 0.3603 | 0.5817 | 0.7772 | 0.543 | 0.6507 | 0.28 | 0.4825 |
0.7842 | 9.0 | 6687 | 0.7766 | 0.4155 | 0.7136 | 0.4165 | 0.1508 | 0.4105 | 0.6974 | 0.1479 | 0.459 | 0.5711 | 0.3734 | 0.5819 | 0.7826 | 0.547 | 0.6543 | 0.2841 | 0.4879 |
0.7814 | 10.0 | 7430 | 0.7803 | 0.4128 | 0.7122 | 0.4133 | 0.1481 | 0.41 | 0.6995 | 0.1467 | 0.4578 | 0.5646 | 0.3611 | 0.577 | 0.7794 | 0.5464 | 0.6516 | 0.2793 | 0.4776 |
0.771 | 11.0 | 8173 | 0.7707 | 0.4145 | 0.7134 | 0.4199 | 0.1524 | 0.4116 | 0.696 | 0.1459 | 0.4624 | 0.5691 | 0.3691 | 0.5823 | 0.7773 | 0.5474 | 0.6548 | 0.2816 | 0.4834 |
0.7674 | 12.0 | 8916 | 0.7643 | 0.4228 | 0.7207 | 0.4258 | 0.1581 | 0.4223 | 0.6966 | 0.1477 | 0.4668 | 0.5762 | 0.3753 | 0.5912 | 0.7799 | 0.5566 | 0.6607 | 0.289 | 0.4916 |
0.7584 | 13.0 | 9659 | 0.7559 | 0.4246 | 0.7265 | 0.4316 | 0.1629 | 0.4213 | 0.7056 | 0.1472 | 0.4698 | 0.5772 | 0.3817 | 0.5872 | 0.7875 | 0.5574 | 0.6611 | 0.2919 | 0.4933 |
0.7516 | 14.0 | 10402 | 0.7555 | 0.4269 | 0.7277 | 0.4334 | 0.1615 | 0.4246 | 0.7072 | 0.1481 | 0.4716 | 0.5787 | 0.3765 | 0.5922 | 0.7879 | 0.5619 | 0.6641 | 0.2919 | 0.4933 |
0.7456 | 15.0 | 11145 | 0.7448 | 0.4328 | 0.7333 | 0.4432 | 0.1679 | 0.4316 | 0.7122 | 0.1507 | 0.4785 | 0.5819 | 0.3883 | 0.5916 | 0.7915 | 0.5686 | 0.6714 | 0.297 | 0.4923 |
0.7404 | 16.0 | 11888 | 0.7475 | 0.4336 | 0.7359 | 0.4363 | 0.1668 | 0.4304 | 0.7181 | 0.1501 | 0.4798 | 0.579 | 0.3815 | 0.5885 | 0.7936 | 0.5668 | 0.6671 | 0.3005 | 0.4909 |
0.7404 | 17.0 | 12631 | 0.7393 | 0.4361 | 0.7386 | 0.4473 | 0.1752 | 0.4337 | 0.7128 | 0.1509 | 0.4821 | 0.583 | 0.3997 | 0.5905 | 0.787 | 0.5745 | 0.6745 | 0.2978 | 0.4915 |
0.7279 | 18.0 | 13374 | 0.7331 | 0.4384 | 0.7406 | 0.4417 | 0.1771 | 0.4376 | 0.7105 | 0.1508 | 0.4833 | 0.5879 | 0.3965 | 0.5984 | 0.7926 | 0.578 | 0.6775 | 0.2988 | 0.4982 |
0.722 | 19.0 | 14117 | 0.7329 | 0.4407 | 0.742 | 0.4443 | 0.1834 | 0.4393 | 0.7138 | 0.1505 | 0.4866 | 0.5898 | 0.4042 | 0.5995 | 0.7887 | 0.5748 | 0.6751 | 0.3067 | 0.5044 |
0.7184 | 20.0 | 14860 | 0.7240 | 0.4484 | 0.748 | 0.4557 | 0.1872 | 0.4476 | 0.7186 | 0.153 | 0.4915 | 0.5943 | 0.4045 | 0.6062 | 0.7926 | 0.5846 | 0.6819 | 0.3121 | 0.5066 |
0.7177 | 21.0 | 15603 | 0.7266 | 0.4447 | 0.75 | 0.4517 | 0.1856 | 0.4419 | 0.7154 | 0.1515 | 0.4883 | 0.5893 | 0.4061 | 0.5988 | 0.7866 | 0.58 | 0.6777 | 0.3095 | 0.5009 |
0.7077 | 22.0 | 16346 | 0.7172 | 0.4496 | 0.752 | 0.4618 | 0.1861 | 0.4486 | 0.7199 | 0.1524 | 0.4921 | 0.5935 | 0.4065 | 0.6031 | 0.7946 | 0.5856 | 0.6812 | 0.3137 | 0.5057 |
0.7073 | 23.0 | 17089 | 0.7199 | 0.4471 | 0.7489 | 0.4598 | 0.1882 | 0.4443 | 0.7203 | 0.1518 | 0.4898 | 0.5944 | 0.4094 | 0.6039 | 0.7936 | 0.5819 | 0.6807 | 0.3123 | 0.5081 |
0.7043 | 24.0 | 17832 | 0.7139 | 0.4525 | 0.7506 | 0.4618 | 0.1893 | 0.4508 | 0.7258 | 0.1542 | 0.4964 | 0.5994 | 0.4122 | 0.6084 | 0.8026 | 0.589 | 0.6827 | 0.316 | 0.516 |
0.6988 | 25.0 | 18575 | 0.7132 | 0.4527 | 0.7543 | 0.4627 | 0.19 | 0.4498 | 0.7296 | 0.1538 | 0.4957 | 0.5967 | 0.4039 | 0.6064 | 0.805 | 0.591 | 0.6854 | 0.3144 | 0.5081 |
0.6957 | 26.0 | 19318 | 0.7092 | 0.4545 | 0.7561 | 0.4626 | 0.1934 | 0.4516 | 0.7304 | 0.1539 | 0.4973 | 0.5984 | 0.4111 | 0.6069 | 0.8027 | 0.5887 | 0.6838 | 0.3203 | 0.513 |
0.6864 | 27.0 | 20061 | 0.7065 | 0.4559 | 0.7552 | 0.4667 | 0.1973 | 0.4536 | 0.7279 | 0.1542 | 0.4987 | 0.5998 | 0.4103 | 0.6117 | 0.7982 | 0.5941 | 0.6895 | 0.3178 | 0.5101 |
0.684 | 28.0 | 20804 | 0.7045 | 0.458 | 0.7582 | 0.4746 | 0.1966 | 0.4572 | 0.7311 | 0.1545 | 0.4997 | 0.6022 | 0.415 | 0.6116 | 0.8053 | 0.594 | 0.6893 | 0.322 | 0.5152 |
0.681 | 29.0 | 21547 | 0.7040 | 0.4574 | 0.7603 | 0.4715 | 0.1971 | 0.4563 | 0.7296 | 0.1536 | 0.4988 | 0.5987 | 0.4136 | 0.6073 | 0.8004 | 0.591 | 0.6872 | 0.3239 | 0.5102 |
0.6769 | 30.0 | 22290 | 0.7023 | 0.4585 | 0.7613 | 0.4703 | 0.2004 | 0.4565 | 0.7335 | 0.1539 | 0.5012 | 0.6019 | 0.4214 | 0.6084 | 0.8038 | 0.5922 | 0.6902 | 0.3247 | 0.5136 |
0.6774 | 31.0 | 23033 | 0.6974 | 0.4607 | 0.7646 | 0.4775 | 0.2032 | 0.4594 | 0.7304 | 0.1543 | 0.502 | 0.6048 | 0.4317 | 0.6094 | 0.8032 | 0.5963 | 0.6924 | 0.3251 | 0.5173 |
0.6678 | 32.0 | 23776 | 0.6914 | 0.4654 | 0.7623 | 0.4756 | 0.2076 | 0.4642 | 0.7337 | 0.1559 | 0.5067 | 0.6088 | 0.4287 | 0.6175 | 0.8047 | 0.6021 | 0.6976 | 0.3287 | 0.5201 |
0.6733 | 33.0 | 24519 | 0.6896 | 0.4664 | 0.767 | 0.4805 | 0.212 | 0.4653 | 0.7326 | 0.1552 | 0.5086 | 0.6078 | 0.4246 | 0.6166 | 0.8067 | 0.6038 | 0.6979 | 0.329 | 0.5177 |
0.6656 | 34.0 | 25262 | 0.6878 | 0.4687 | 0.769 | 0.4857 | 0.2133 | 0.4682 | 0.7353 | 0.1558 | 0.5112 | 0.6078 | 0.4241 | 0.6173 | 0.8055 | 0.6048 | 0.6975 | 0.3326 | 0.5181 |
0.6599 | 35.0 | 26005 | 0.6848 | 0.4716 | 0.7718 | 0.492 | 0.2121 | 0.4717 | 0.7364 | 0.156 | 0.5135 | 0.6121 | 0.4292 | 0.6218 | 0.8082 | 0.6081 | 0.7002 | 0.3351 | 0.524 |
0.6646 | 36.0 | 26748 | 0.6857 | 0.4709 | 0.7721 | 0.487 | 0.2129 | 0.4711 | 0.7369 | 0.1565 | 0.5137 | 0.6109 | 0.4316 | 0.6184 | 0.8092 | 0.6073 | 0.7001 | 0.3344 | 0.5217 |
0.6568 | 37.0 | 27491 | 0.6867 | 0.4707 | 0.7729 | 0.4843 | 0.2147 | 0.4694 | 0.7393 | 0.1564 | 0.5117 | 0.6102 | 0.4252 | 0.6195 | 0.8094 | 0.6065 | 0.6985 | 0.3349 | 0.5219 |
0.6493 | 38.0 | 28234 | 0.6830 | 0.4713 | 0.771 | 0.4835 | 0.2121 | 0.4734 | 0.7357 | 0.1573 | 0.5131 | 0.6118 | 0.4277 | 0.622 | 0.8083 | 0.6081 | 0.7002 | 0.3345 | 0.5234 |
0.6567 | 39.0 | 28977 | 0.6813 | 0.4724 | 0.771 | 0.4841 | 0.2117 | 0.4729 | 0.7396 | 0.1573 | 0.515 | 0.6135 | 0.4351 | 0.6213 | 0.8097 | 0.6098 | 0.701 | 0.3351 | 0.526 |
0.6532 | 40.0 | 29720 | 0.6797 | 0.4743 | 0.7751 | 0.4848 | 0.2137 | 0.4761 | 0.7369 | 0.1573 | 0.516 | 0.6149 | 0.4354 | 0.6243 | 0.8077 | 0.6101 | 0.7019 | 0.3384 | 0.5279 |
0.6475 | 41.0 | 30463 | 0.6769 | 0.4755 | 0.773 | 0.4903 | 0.219 | 0.4742 | 0.7397 | 0.1572 | 0.5193 | 0.6169 | 0.4418 | 0.6248 | 0.8088 | 0.6125 | 0.7044 | 0.3384 | 0.5295 |
0.6432 | 42.0 | 31206 | 0.6779 | 0.4762 | 0.7757 | 0.4926 | 0.2171 | 0.4777 | 0.739 | 0.158 | 0.5184 | 0.6168 | 0.4384 | 0.6262 | 0.8079 | 0.6122 | 0.703 | 0.3403 | 0.5305 |
0.6482 | 43.0 | 31949 | 0.6762 | 0.4759 | 0.7756 | 0.4897 | 0.218 | 0.4755 | 0.74 | 0.1579 | 0.5169 | 0.6141 | 0.4329 | 0.624 | 0.8071 | 0.6132 | 0.7042 | 0.3385 | 0.524 |
0.6427 | 44.0 | 32692 | 0.6744 | 0.4771 | 0.776 | 0.49 | 0.2167 | 0.4766 | 0.7445 | 0.1591 | 0.5195 | 0.6159 | 0.4333 | 0.6258 | 0.8112 | 0.616 | 0.7064 | 0.3382 | 0.5254 |
0.6409 | 45.0 | 33435 | 0.6758 | 0.4767 | 0.777 | 0.4882 | 0.2189 | 0.4762 | 0.7426 | 0.1581 | 0.5181 | 0.6155 | 0.437 | 0.6239 | 0.8099 | 0.6141 | 0.7046 | 0.3393 | 0.5264 |
0.6361 | 46.0 | 34178 | 0.6748 | 0.4758 | 0.7762 | 0.4888 | 0.2178 | 0.4744 | 0.7448 | 0.1577 | 0.5177 | 0.6139 | 0.4299 | 0.6234 | 0.8116 | 0.6135 | 0.704 | 0.338 | 0.5238 |
0.6383 | 47.0 | 34921 | 0.6757 | 0.475 | 0.7788 | 0.4883 | 0.217 | 0.4751 | 0.7424 | 0.158 | 0.5184 | 0.6139 | 0.4278 | 0.6244 | 0.8116 | 0.6115 | 0.7031 | 0.3384 | 0.5247 |
0.6421 | 48.0 | 35664 | 0.6717 | 0.4793 | 0.7796 | 0.4909 | 0.2217 | 0.4788 | 0.7447 | 0.1589 | 0.5208 | 0.6186 | 0.4413 | 0.627 | 0.8114 | 0.6161 | 0.7071 | 0.3426 | 0.5301 |
0.6357 | 49.0 | 36407 | 0.6712 | 0.4789 | 0.7787 | 0.4916 | 0.2215 | 0.4789 | 0.7425 | 0.1592 | 0.5219 | 0.6188 | 0.4403 | 0.6279 | 0.8114 | 0.6161 | 0.7069 | 0.3418 | 0.5308 |
0.6322 | 50.0 | 37150 | 0.6715 | 0.4792 | 0.7795 | 0.4922 | 0.2219 | 0.4792 | 0.7436 | 0.1587 | 0.5223 | 0.6188 | 0.4368 | 0.629 | 0.8124 | 0.6174 | 0.7074 | 0.3409 | 0.5302 |
0.6324 | 51.0 | 37893 | 0.6729 | 0.478 | 0.7787 | 0.4906 | 0.2206 | 0.4772 | 0.7447 | 0.1585 | 0.5202 | 0.6171 | 0.4379 | 0.6254 | 0.8126 | 0.6153 | 0.7048 | 0.3407 | 0.5293 |
0.6402 | 52.0 | 38636 | 0.6707 | 0.4806 | 0.7792 | 0.4978 | 0.2222 | 0.4795 | 0.747 | 0.1592 | 0.5221 | 0.6196 | 0.4419 | 0.6278 | 0.8135 | 0.6174 | 0.7076 | 0.3438 | 0.5317 |
0.6328 | 53.0 | 39379 | 0.6716 | 0.4796 | 0.7794 | 0.4964 | 0.2231 | 0.4789 | 0.7445 | 0.1587 | 0.5212 | 0.6184 | 0.4405 | 0.6269 | 0.812 | 0.6173 | 0.707 | 0.342 | 0.5299 |
0.6349 | 54.0 | 40122 | 0.6715 | 0.4795 | 0.7796 | 0.4941 | 0.223 | 0.4782 | 0.7453 | 0.1587 | 0.5216 | 0.6186 | 0.4399 | 0.6268 | 0.8135 | 0.6165 | 0.7066 | 0.3425 | 0.5305 |
0.6293 | 55.0 | 40865 | 0.6705 | 0.4798 | 0.779 | 0.4921 | 0.2232 | 0.479 | 0.7445 | 0.159 | 0.5222 | 0.6192 | 0.4408 | 0.628 | 0.8123 | 0.6177 | 0.7073 | 0.3419 | 0.5311 |
0.6324 | 56.0 | 41608 | 0.6705 | 0.4804 | 0.78 | 0.4939 | 0.2238 | 0.48 | 0.7446 | 0.1588 | 0.5222 | 0.6198 | 0.4418 | 0.6285 | 0.8127 | 0.618 | 0.7079 | 0.3428 | 0.5318 |
0.6293 | 57.0 | 42351 | 0.6702 | 0.4803 | 0.7796 | 0.4947 | 0.2235 | 0.4792 | 0.7452 | 0.159 | 0.5228 | 0.6197 | 0.4415 | 0.6283 | 0.813 | 0.6178 | 0.708 | 0.3428 | 0.5314 |
0.6353 | 58.0 | 43094 | 0.6701 | 0.4804 | 0.7798 | 0.4943 | 0.224 | 0.4795 | 0.7444 | 0.1588 | 0.5223 | 0.6198 | 0.4422 | 0.6284 | 0.8128 | 0.6183 | 0.7082 | 0.3424 | 0.5315 |
0.6323 | 59.0 | 43837 | 0.6703 | 0.4803 | 0.78 | 0.4935 | 0.2238 | 0.4794 | 0.7443 | 0.1586 | 0.5223 | 0.6196 | 0.4419 | 0.6282 | 0.8124 | 0.6183 | 0.7082 | 0.3423 | 0.5309 |
0.6384 | 60.0 | 44580 | 0.6703 | 0.4803 | 0.7799 | 0.494 | 0.2239 | 0.4794 | 0.7442 | 0.1586 | 0.5224 | 0.6196 | 0.4422 | 0.6281 | 0.8123 | 0.6182 | 0.7081 | 0.3423 | 0.5311 |
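The mAP/mAR columns follow the COCO detection protocol (IoU averaged over 0.5:0.95, plus the 0.5 and 0.75 cut-offs and small/medium/large size buckets). The card does not say which evaluation code produced them; the snippet below only illustrates how such metrics can be computed with `torchmetrics`, with toy boxes standing in for real predictions.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Illustration of the COCO-style metrics reported above; not the original
# evaluation code, and the boxes below are toy values.
metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)

preds = [
    {
        "boxes": torch.tensor([[100.0, 80.0, 220.0, 160.0]]),  # xyxy, pixels
        "scores": torch.tensor([0.87]),
        "labels": torch.tensor([0]),  # e.g. 0 = Car, 1 = Pedestrian (assumed mapping)
    }
]
targets = [
    {
        "boxes": torch.tensor([[98.0, 82.0, 218.0, 158.0]]),
        "labels": torch.tensor([0]),
    }
]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
```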
Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.2
- Tokenizers 0.21.0