---
library_name: transformers
license: apache-2.0
base_model: PekingU/rtdetr_r50vd_coco_o365
tags:
- generated_from_trainer
model-index:
- name: rtdetr-r50-cppe5-finetune
  results: []
---

# rtdetr-r50-cppe5-finetune

This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on a fruit object detection dataset (the exact dataset is not recorded in this card).
It achieves the following COCO-style detection metrics on the evaluation set (a reproduction sketch follows the list):
- Loss: 9.8586
- Map: 0.5282
- Map 50: 0.6578
- Map 75: 0.5509
- Map Small: 0.2525
- Map Medium: 0.502
- Map Large: 0.6946
- Mar 1: 0.2808
- Mar 10: 0.617
- Mar 100: 0.7372
- Mar Small: 0.423
- Mar Medium: 0.7109
- Mar Large: 0.8923
- Map Apple: 0.5218
- Mar 100 Apple: 0.7284
- Map Banana: 0.4594
- Mar 100 Banana: 0.7377
- Map Grapes: 0.3957
- Mar 100 Grapes: 0.6437
- Map Orange: 0.5229
- Mar 100 Orange: 0.6667
- Map Pineapple: 0.6214
- Mar 100 Pineapple: 0.8087
- Map Watermelon: 0.648
- Mar 100 Watermelon: 0.8381
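
The `Map`/`Mar` values above are COCO-style mean average precision and mean average recall, with size-based and per-class breakdowns. Below is a minimal sketch of how comparable numbers can be computed with `torchmetrics` (an assumption, since the exact evaluation script used for this card is not recorded here; the box values and label mapping are illustrative placeholders):

```python
# Hedged sketch: computes COCO-style mAP/mAR with torchmetrics. The exact
# evaluation pipeline behind the numbers above is not recorded in this card.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# One dict per image; boxes are (x1, y1, x2, y2) in absolute pixels.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),  # placeholder label id
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
# compute() returns map, map_50, map_75, map_small/medium/large,
# mar_1/10/100, and per-class values when class_metrics=True.
print(metric.compute())
```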

## Model description

This checkpoint fine-tunes [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365), an RT-DETR (Real-Time DEtection TRansformer) object detector with a ResNet-50-vd backbone pretrained on Objects365 and COCO, for a six-class fruit detection task (apple, banana, grapes, orange, pineapple, watermelon). Note that despite the `cppe5` suffix in the model name, the per-class metrics above correspond to fruit categories, not the CPPE-5 medical protective-equipment classes.

## Intended uses & limitations

The model is intended for detecting the six fruit classes listed above in natural images. One limitation is visible from the evaluation metrics: performance on small objects is markedly weaker than on large ones (Map Small 0.2525 vs. Map Large 0.6946), so detections of small or distant fruit should be treated with caution. The provenance, licensing, and domain coverage of the fine-tuning data are not recorded in this card.
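
A minimal inference sketch follows, assuming the standard `transformers` object-detection API; the checkpoint id and image path below are placeholders, not values recorded in this card:

```python
# Minimal inference sketch, assuming the standard transformers object-detection
# API. The checkpoint id and image path are placeholders, not recorded values.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "rtdetr-r50-cppe5-finetune"  # placeholder: your repo id or local path
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("fruit.jpg")  # placeholder image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded (score, label, xyxy box) detections
# at the original image resolution; image.size is (width, height).
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=torch.tensor([image.size[::-1]])
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```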

## Training and evaluation data

The Trainer did not record the dataset identity. The per-class metrics indicate an object detection dataset annotated with six fruit categories (apple, banana, grapes, orange, pineapple, watermelon), and the step counts in the table below (750 steps per epoch at batch size 8) imply roughly 6,000 training images. Split definitions and annotation sources are not recorded in this card.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (betas=(0.9, 0.999), epsilon=1e-08, no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
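
As a hedged sketch, the values above map onto `TrainingArguments` as shown below; only the listed hyperparameters come from this card, while `output_dir` and the surrounding Trainer wiring (model, data collator, datasets) are assumptions:

```python
# Sketch mapping the hyperparameters above onto TrainingArguments. Only the
# listed values come from this card; output_dir is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr-r50-cppe5-finetune",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=10,
)
```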

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Apple | Mar 100 Apple | Map Banana | Mar 100 Banana | Map Grapes | Mar 100 Grapes | Map Orange | Mar 100 Orange | Map Pineapple | Mar 100 Pineapple | Map Watermelon | Mar 100 Watermelon |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|:-------------:|:-----------------:|:--------------:|:------------------:|
| 42.2465       | 1.0   | 750  | 11.9797         | 0.3966 | 0.5058 | 0.417  | 0.1431    | 0.3331     | 0.5748    | 0.2443 | 0.5396 | 0.6893  | 0.3383    | 0.656      | 0.8619    | 0.3978    | 0.6735        | 0.3743     | 0.7125         | 0.2978     | 0.5641         | 0.4102     | 0.6402         | 0.4225        | 0.7685            | 0.4771         | 0.7773             |
| 15.4425       | 2.0   | 1500 | 10.7905         | 0.4461 | 0.5553 | 0.4689 | 0.1701    | 0.3998     | 0.6131    | 0.2634 | 0.5668 | 0.7036  | 0.3638    | 0.663      | 0.8779    | 0.4239    | 0.6906        | 0.437      | 0.7281         | 0.3405     | 0.6118         | 0.4262     | 0.6468         | 0.5435        | 0.7804            | 0.5053         | 0.7636             |
| 14.2856       | 3.0   | 2250 | 9.9898          | 0.4937 | 0.6229 | 0.5166 | 0.2073    | 0.4512     | 0.6644    | 0.2691 | 0.5859 | 0.7224  | 0.4119    | 0.6999     | 0.8802    | 0.4883    | 0.7015        | 0.4771     | 0.7369         | 0.3631     | 0.6162         | 0.4966     | 0.654          | 0.5767        | 0.7971            | 0.5607         | 0.8284             |
| 13.0156       | 4.0   | 3000 | 10.1385         | 0.5064 | 0.6308 | 0.5323 | 0.2148    | 0.4725     | 0.6794    | 0.274  | 0.5986 | 0.7294  | 0.4062    | 0.7103     | 0.8853    | 0.4728    | 0.7104        | 0.4569     | 0.738          | 0.3955     | 0.6261         | 0.5067     | 0.6602         | 0.6041        | 0.8011            | 0.6022         | 0.8403             |
| 12.4118       | 5.0   | 3750 | 10.0754         | 0.5084 | 0.6286 | 0.533  | 0.2254    | 0.4758     | 0.6844    | 0.2754 | 0.6012 | 0.7305  | 0.3992    | 0.7066     | 0.8904    | 0.4911    | 0.7103        | 0.488      | 0.7457         | 0.3875     | 0.6389         | 0.5065     | 0.6658         | 0.5897        | 0.7855            | 0.588          | 0.8366             |
| 11.7444       | 6.0   | 4500 | 10.1131         | 0.5119 | 0.6318 | 0.5379 | 0.209     | 0.477      | 0.6834    | 0.2742 | 0.6055 | 0.7302  | 0.399     | 0.6996     | 0.8898    | 0.4975    | 0.7185        | 0.4644     | 0.7266         | 0.391      | 0.6546         | 0.5165     | 0.6646         | 0.5963        | 0.7989            | 0.6059         | 0.8182             |
| 11.3657       | 7.0   | 5250 | 10.4886         | 0.4898 | 0.608  | 0.5144 | 0.2211    | 0.4666     | 0.6488    | 0.2736 | 0.5901 | 0.7258  | 0.3896    | 0.6946     | 0.8869    | 0.4952    | 0.7158        | 0.4309     | 0.7397         | 0.3444     | 0.6269         | 0.5001     | 0.6587         | 0.5822        | 0.7989            | 0.5859         | 0.8151             |
| 11.0681       | 8.0   | 6000 | 9.8240          | 0.5251 | 0.652  | 0.5511 | 0.2452    | 0.4984     | 0.6922    | 0.2809 | 0.6129 | 0.7389  | 0.4201    | 0.711      | 0.8945    | 0.5171    | 0.7279        | 0.471      | 0.7451         | 0.3935     | 0.6524         | 0.5214     | 0.6668         | 0.6087        | 0.8011            | 0.6388         | 0.8403             |
| 10.7525       | 9.0   | 6750 | 9.8244          | 0.5185 | 0.644  | 0.5425 | 0.2364    | 0.4832     | 0.6893    | 0.2799 | 0.6088 | 0.7399  | 0.4262    | 0.7159     | 0.8938    | 0.5137    | 0.7293        | 0.4548     | 0.753          | 0.3932     | 0.6471         | 0.5181     | 0.6659         | 0.6112        | 0.8047            | 0.6197         | 0.8395             |
| 10.5616       | 10.0  | 7500 | 9.8586          | 0.5282 | 0.6578 | 0.5509 | 0.2525    | 0.502      | 0.6946    | 0.2808 | 0.617  | 0.7372  | 0.423     | 0.7109     | 0.8923    | 0.5218    | 0.7284        | 0.4594     | 0.7377         | 0.3957     | 0.6437         | 0.5229     | 0.6667         | 0.6214        | 0.8087            | 0.648          | 0.8381             |


### Framework versions

- Transformers 4.53.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1