rtdetr-v2-r18-chkbx

This model is a fine-tuned version of PekingU/rtdetr_v2_r18vd for checkbox detection, with two object classes: checked and unchecked. The training dataset is not documented. The model achieves the following results on the evaluation set (a usage sketch follows the metric list):

  • Loss: 4.4842
  • Map: 0.5987
  • Map 50: 0.8738
  • Map 75: 0.7039
  • Map Small: 0.502
  • Map Medium: 0.7704
  • Map Large: 0.7529
  • Mar 1: 0.2319
  • Mar 10: 0.5636
  • Mar 100: 0.7272
  • Mar Small: 0.6541
  • Mar Medium: 0.8382
  • Mar Large: 0.8409
  • Map Checked-unchecked: -1.0 (not computed for this grouping)
  • Mar 100 Checked-unchecked: -1.0 (not computed for this grouping)
  • Map Checked: 0.5973
  • Mar 100 Checked: 0.7657
  • Map Unchecked: 0.6001
  • Mar 100 Unchecked: 0.6886
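
The checkpoint follows the standard transformers object-detection interface, so it can be loaded with AutoImageProcessor and AutoModelForObjectDetection. Below is a minimal inference sketch; the image path and the 0.5 score threshold are placeholders, not values from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo = "wendys-llc/rtdetr-v2-r18-chkbx"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForObjectDetection.from_pretrained(repo)

image = Image.open("form_page.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples in pixel coordinates.
results = processor.post_process_object_detection(
    outputs,
    target_sizes=torch.tensor([image.size[::-1]]),  # (height, width)
    threshold=0.5,  # placeholder: tune for your precision/recall trade-off
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```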

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_warmup_steps: 300
  • num_epochs: 20
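
As a reference for reproducing the setup, here is a minimal sketch of how these settings map onto transformers.TrainingArguments. The output_dir value and everything not listed above (dataset, collator, model wiring) are assumptions, not part of this card.

```python
from transformers import TrainingArguments

# Assumption: values mirror the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="rtdetr-v2-r18-chkbx",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",          # fused AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=300,
    num_train_epochs=20,
)
```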

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Checked | Map Checked-unchecked | Map Large | Map Medium | Map Small | Map Unchecked | Mar 1 | Mar 10 | Mar 100 | Mar 100 Checked | Mar 100 Checked-unchecked | Mar 100 Unchecked | Mar Large | Mar Medium | Mar Small |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 45 | 41.6291 | 0.0006 | 0.0021 | 0.0003 | 0.0011 | -1.0 | 0.001 | 0.0007 | 0.0008 | 0.0001 | 0.001 | 0.0168 | 0.0694 | 0.1106 | -1.0 | 0.0281 | 0.0384 | 0.1169 | 0.0552 |
| No log | 2.0 | 90 | 22.2922 | 0.0262 | 0.0434 | 0.0271 | 0.0286 | -1.0 | 0.011 | 0.0979 | 0.0111 | 0.0237 | 0.0527 | 0.1254 | 0.1616 | 0.2049 | -1.0 | 0.1182 | 0.2451 | 0.314 | 0.0793 |
| No log | 3.0 | 135 | 13.8175 | 0.1248 | 0.1961 | 0.1332 | 0.1221 | -1.0 | 0.1424 | 0.236 | 0.1057 | 0.1275 | 0.1264 | 0.2977 | 0.4227 | 0.471 | -1.0 | 0.3744 | 0.546 | 0.6483 | 0.3003 |
| No log | 4.0 | 180 | 8.8445 | 0.2482 | 0.3986 | 0.2703 | 0.254 | -1.0 | 0.2724 | 0.3975 | 0.2201 | 0.2423 | 0.1658 | 0.4436 | 0.6224 | 0.7173 | -1.0 | 0.5276 | 0.7228 | 0.8022 | 0.5239 |
| No log | 5.0 | 225 | 6.7102 | 0.4177 | 0.6659 | 0.4639 | 0.4061 | -1.0 | 0.4542 | 0.6142 | 0.3615 | 0.4293 | 0.1899 | 0.4831 | 0.6874 | 0.7353 | -1.0 | 0.6395 | 0.8224 | 0.7959 | 0.6124 |
| No log | 6.0 | 270 | 6.0112 | 0.4191 | 0.6537 | 0.4843 | 0.3641 | -1.0 | 0.4843 | 0.6275 | 0.3755 | 0.4741 | 0.2018 | 0.5005 | 0.693 | 0.7417 | -1.0 | 0.6443 | 0.8185 | 0.8044 | 0.619 |
| No log | 7.0 | 315 | 5.4113 | 0.4432 | 0.6864 | 0.5161 | 0.4496 | -1.0 | 0.559 | 0.674 | 0.3883 | 0.4368 | 0.2113 | 0.5198 | 0.689 | 0.7297 | -1.0 | 0.6483 | 0.8223 | 0.8073 | 0.6095 |
| No log | 8.0 | 360 | 5.0758 | 0.4997 | 0.7682 | 0.5818 | 0.4962 | -1.0 | 0.6691 | 0.7045 | 0.4303 | 0.5032 | 0.2149 | 0.5279 | 0.7027 | 0.7512 | -1.0 | 0.6543 | 0.8327 | 0.8125 | 0.6291 |
| No log | 9.0 | 405 | 4.9946 | 0.5426 | 0.8212 | 0.6339 | 0.562 | -1.0 | 0.7287 | 0.7152 | 0.4604 | 0.5233 | 0.2206 | 0.5423 | 0.7181 | 0.7647 | -1.0 | 0.6716 | 0.8556 | 0.8252 | 0.6441 |
| No log | 10.0 | 450 | 4.8400 | 0.5308 | 0.8058 | 0.6155 | 0.538 | -1.0 | 0.7092 | 0.7136 | 0.4281 | 0.5237 | 0.2196 | 0.5293 | 0.7044 | 0.7611 | -1.0 | 0.6477 | 0.8416 | 0.8136 | 0.6325 |
| No log | 11.0 | 495 | 4.6875 | 0.535 | 0.8049 | 0.6239 | 0.5128 | -1.0 | 0.6637 | 0.7219 | 0.4634 | 0.5571 | 0.2244 | 0.5421 | 0.7196 | 0.7597 | -1.0 | 0.6795 | 0.8391 | 0.8188 | 0.651 |
| 16.3849 | 12.0 | 540 | 4.7347 | 0.5365 | 0.8085 | 0.6182 | 0.5084 | -1.0 | 0.7087 | 0.7386 | 0.4567 | 0.5646 | 0.2252 | 0.5536 | 0.7213 | 0.7682 | -1.0 | 0.6744 | 0.857 | 0.8287 | 0.647 |
| 16.3849 | 13.0 | 585 | 4.6583 | 0.5669 | 0.8287 | 0.6625 | 0.5498 | -1.0 | 0.7286 | 0.7453 | 0.4726 | 0.5839 | 0.2274 | 0.5552 | 0.7236 | 0.7597 | -1.0 | 0.6875 | 0.8368 | 0.8269 | 0.6547 |
| 16.3849 | 14.0 | 630 | 4.5310 | 0.5708 | 0.8394 | 0.6632 | 0.5633 | -1.0 | 0.7665 | 0.7367 | 0.4886 | 0.5784 | 0.2254 | 0.56 | 0.7291 | 0.7682 | -1.0 | 0.6901 | 0.8649 | 0.8318 | 0.6569 |
| 16.3849 | 15.0 | 675 | 4.4842 | 0.5987 | 0.8738 | 0.7039 | 0.5973 | -1.0 | 0.7529 | 0.7704 | 0.502 | 0.6001 | 0.2319 | 0.5636 | 0.7272 | 0.7657 | -1.0 | 0.6886 | 0.8409 | 0.8382 | 0.6541 |
| 16.3849 | 16.0 | 720 | 4.5336 | 0.5882 | 0.8529 | 0.6968 | 0.5864 | -1.0 | 0.7495 | 0.7633 | 0.4995 | 0.5899 | 0.2263 | 0.5633 | 0.7353 | 0.7795 | -1.0 | 0.6912 | 0.8576 | 0.8395 | 0.6654 |
| 16.3849 | 17.0 | 765 | 4.5010 | 0.5604 | 0.8143 | 0.6583 | 0.5618 | -1.0 | 0.7133 | 0.758 | 0.4845 | 0.5591 | 0.2297 | 0.5573 | 0.7329 | 0.776 | -1.0 | 0.6898 | 0.8597 | 0.8434 | 0.6578 |
| 16.3849 | 18.0 | 810 | 4.4966 | 0.5878 | 0.8533 | 0.6936 | 0.5818 | -1.0 | 0.7421 | 0.7672 | 0.4993 | 0.5937 | 0.2325 | 0.5644 | 0.7324 | 0.7724 | -1.0 | 0.6923 | 0.8402 | 0.8398 | 0.6614 |
| 16.3849 | 19.0 | 855 | 4.5446 | 0.5836 | 0.8469 | 0.6881 | 0.5747 | -1.0 | 0.7538 | 0.7627 | 0.4941 | 0.5924 | 0.2295 | 0.5633 | 0.7325 | 0.7693 | -1.0 | 0.6957 | 0.8559 | 0.84 | 0.6591 |
| 16.3849 | 20.0 | 900 | 4.5455 | 0.589 | 0.8579 | 0.6926 | 0.5771 | -1.0 | 0.7609 | 0.7623 | 0.4971 | 0.6008 | 0.2326 | 0.5651 | 0.7338 | 0.7728 | -1.0 | 0.6949 | 0.8517 | 0.8389 | 0.6618 |
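
The per-class columns (Map Checked, Mar 100 Unchecked, and so on) are what torchmetrics' MeanAveragePrecision reports when class_metrics=True, which is how the Hugging Face object-detection examples typically compute them; whether this exact setup was used here is an assumption. A minimal sketch with hypothetical boxes:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True adds per-class AP/AR alongside the aggregate COCO metrics.
metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)

# One image with hypothetical xyxy boxes; 0 = checked, 1 = unchecked (assumed label map).
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 40.0, 40.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
target = [{
    "boxes": torch.tensor([[12.0, 11.0, 38.0, 42.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, target)
print(metric.compute())  # map, map_50, map_75, mar_100, plus per-class entries
```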

Framework versions

  • Transformers 4.56.0
  • Pytorch 2.8.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.0