---
library_name: transformers
license: apache-2.0
base_model: PekingU/rtdetr_v2_r50vd
tags:
- generated_from_trainer
model-index:
- name: rtdetr-v2-r50-cppe5-finetune-2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# rtdetr-v2-r50-cppe5-finetune-2

This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unspecified object-detection dataset with two classes, `football` and `player` (per the evaluation metrics below).
It achieves the following results on the evaluation set:
- Loss: 10.6505
- mAP: 0.4323
- mAP@0.50: 0.8963
- mAP@0.75: 0.3219
- mAP (small): 0.3756
- mAP (medium): 0.5476
- mAP (large): 0.7106
- mAR@1: 0.2852
- mAR@10: 0.4788
- mAR@100: 0.5715
- mAR (small): 0.5305
- mAR (medium): 0.6698
- mAR (large): 0.7583
- mAP (football): 0.4833
- mAR@100 (football): 0.5867
- mAP (player): 0.3814
- mAR@100 (player): 0.5564

## Model description

RT-DETRv2 is a real-time, end-to-end transformer-based object detector in the DETR family. This checkpoint starts from the `rtdetr_v2_r50vd` variant (ResNet-50-vd backbone) and is fine-tuned to detect two classes: `football` and `player`.

## Intended uses & limitations

More information needed
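A minimal inference sketch using the standard `transformers` object-detection API. The checkpoint id below is a placeholder for this repository's Hub path, and the image filename is an assumption; both need to be replaced with your own values.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id -- replace with the actual Hub path of this checkpoint.
checkpoint = "rtdetr-v2-r50-cppe5-finetune-2"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)
model.eval()

image = Image.open("match_frame.jpg")  # any RGB image (hypothetical filename)
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to labeled detections in original image coordinates.
results = processor.post_process_object_detection(
    outputs,
    target_sizes=torch.tensor([image.size[::-1]]),  # (height, width)
    threshold=0.5,
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```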

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 300
- num_epochs: 12

### Training results

| Training Loss | Epoch | Step | Validation Loss | mAP    | mAP@0.50 | mAP@0.75 | mAP (S) | mAP (M) | mAP (L) | mAR@1  | mAR@10 | mAR@100 | mAR (S) | mAR (M) | mAR (L) | mAP Football | mAR@100 Football | mAP Player | mAR@100 Player |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:----------:|:--------------:|
| No log        | 1.0   | 62   | 17.5576         | 0.1072 | 0.2555 | 0.0663 | 0.0878    | 0.1914     | 0.1325    | 0.0543 | 0.208  | 0.3404  | 0.2858    | 0.4734     | 0.5447    | 0.0102       | 0.2166           | 0.2042     | 0.4642         |
| No log        | 2.0   | 124  | 9.5985          | 0.3615 | 0.8054 | 0.2547 | 0.3082    | 0.5041     | 0.5671    | 0.2189 | 0.4346 | 0.5265  | 0.4739    | 0.6782     | 0.6455    | 0.3255       | 0.4828           | 0.3975     | 0.5702         |
| No log        | 3.0   | 186  | 9.6654          | 0.3647 | 0.8181 | 0.2656 | 0.3115    | 0.5271     | 0.6       | 0.2353 | 0.4404 | 0.5339  | 0.4772    | 0.685      | 0.7447    | 0.3696       | 0.5225           | 0.3597     | 0.5452         |
| No log        | 4.0   | 248  | 10.0423         | 0.3637 | 0.8276 | 0.2328 | 0.3011    | 0.5362     | 0.6091    | 0.2209 | 0.4219 | 0.5103  | 0.4515    | 0.674      | 0.7477    | 0.345        | 0.474            | 0.3823     | 0.5467         |
| No log        | 5.0   | 310  | 10.2540         | 0.3952 | 0.8224 | 0.3201 | 0.3304    | 0.5617     | 0.6733    | 0.2556 | 0.4589 | 0.548   | 0.4924    | 0.6929     | 0.7598    | 0.4203       | 0.5509           | 0.3702     | 0.5452         |
| No log        | 6.0   | 372  | 10.4936         | 0.3862 | 0.8134 | 0.3167 | 0.3148    | 0.5569     | 0.6616    | 0.2459 | 0.4368 | 0.5254  | 0.4659    | 0.6818     | 0.797     | 0.3995       | 0.5089           | 0.373      | 0.5418         |
| No log        | 7.0   | 434  | 10.6991         | 0.4119 | 0.8405 | 0.3383 | 0.348     | 0.5668     | 0.6627    | 0.2615 | 0.4528 | 0.5379  | 0.4824    | 0.6929     | 0.7515    | 0.4282       | 0.5219           | 0.3955     | 0.5539         |
| No log        | 8.0   | 496  | 10.7472         | 0.4216 | 0.8338 | 0.3662 | 0.3592    | 0.5737     | 0.6884    | 0.2668 | 0.4624 | 0.5513  | 0.4966    | 0.701      | 0.7561    | 0.4488       | 0.5438           | 0.3943     | 0.5588         |
| 19.6326       | 9.0   | 558  | 10.9720         | 0.3984 | 0.8353 | 0.3183 | 0.3307    | 0.5564     | 0.7087    | 0.2605 | 0.4449 | 0.5241  | 0.4658    | 0.6661     | 0.797     | 0.4371       | 0.5302           | 0.3598     | 0.518          |
| 19.6326       | 10.0  | 620  | 10.9521         | 0.4117 | 0.8392 | 0.3436 | 0.3458    | 0.569      | 0.6785    | 0.2584 | 0.4495 | 0.5315  | 0.4753    | 0.6815     | 0.75      | 0.436        | 0.5195           | 0.3873     | 0.5435         |
| 19.6326       | 11.0  | 682  | 11.0008         | 0.4158 | 0.8449 | 0.3501 | 0.3509    | 0.5711     | 0.6707    | 0.2626 | 0.4517 | 0.5369  | 0.4837    | 0.6821     | 0.7447    | 0.4374       | 0.5237           | 0.3942     | 0.5502         |
| 19.6326       | 12.0  | 744  | 11.0544         | 0.411  | 0.8363 | 0.3415 | 0.3436    | 0.5727     | 0.6689    | 0.259  | 0.4496 | 0.5331  | 0.4775    | 0.6857     | 0.7439    | 0.4336       | 0.5225           | 0.3883     | 0.5438         |

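The mAP columns are COCO-style: a prediction counts as a true positive only when its IoU with a ground-truth box clears the threshold (0.50, 0.75, or averaged over 0.50:0.95 for the headline mAP). The large gap here between mAP@0.50 (~0.90) and mAP@0.75 (~0.32) indicates boxes are usually roughly right but not tightly localized. A minimal IoU check, as a sketch:

```python
def box_iou(a, b):
    """IoU of two axis-aligned boxes in (x_min, y_min, x_max, y_max) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A box shifted by a quarter of its width still clears the 0.50 threshold
# but fails the stricter 0.75 one -- loose localization, as in this card.
pred = (25.0, 0.0, 125.0, 100.0)
gt = (0.0, 0.0, 100.0, 100.0)
iou = box_iou(pred, gt)  # 0.6
```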

### Framework versions

- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1