End of training
README.md CHANGED
@@ -1,11 +1,11 @@
 ---
-base_model: microsoft/git-large-r-coco
-datasets:
-- imagefolder
 library_name: transformers
 license: mit
+base_model: microsoft/git-large-r-coco
 tags:
 - generated_from_trainer
+datasets:
+- imagefolder
 model-index:
 - name: git-large-r-coco-IDB_ADv1_COCOv6-rv2
   results: []
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss:
-- Meteor Score: {'meteor': 0.
+- Loss: 1.2960
+- Meteor Score: {'meteor': 0.6500893825313575}
 
 ## Model description
 
@@ -42,82 +42,36 @@ The following hyperparameters were used during training:
 - train_batch_size: 6
 - eval_batch_size: 8
 - seed: 42
-- gradient_accumulation_steps:
-- total_train_batch_size:
+- gradient_accumulation_steps: 16
+- total_train_batch_size: 96
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
-- lr_scheduler_type:
-- lr_scheduler_warmup_steps:
-- num_epochs:
+- lr_scheduler_type: cosine
+- lr_scheduler_warmup_steps: 50
+- num_epochs: 90
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Meteor Score |
-| 35.
-| 30.8725 | 23.75 | 95 | 7.5146 | {'meteor': 0.09846856906235638} |
-| 30.2577 | 25.0 | 100 | 7.3801 | {'meteor': 0.10276759760059294} |
-| 29.7474 | 26.25 | 105 | 7.2507 | {'meteor': 0.10613210049792668} |
-| 29.2894 | 27.5 | 110 | 7.1239 | {'meteor': 0.13790065127920864} |
-| 28.6819 | 28.75 | 115 | 7.0016 | {'meteor': 0.1499525264660323} |
-| 28.2766 | 30.0 | 120 | 6.8826 | {'meteor': 0.1623889673105055} |
-| 27.8368 | 31.25 | 125 | 6.7674 | {'meteor': 0.16558142946846452} |
-| 27.2363 | 32.5 | 130 | 6.6516 | {'meteor': 0.18067707147696366} |
-| 26.9104 | 33.75 | 135 | 6.5395 | {'meteor': 0.1902076055826109} |
-| 26.3969 | 35.0 | 140 | 6.4219 | {'meteor': 0.19967391183196687} |
-| 25.9642 | 36.25 | 145 | 6.3067 | {'meteor': 0.20491030194678278} |
-| 25.4261 | 37.5 | 150 | 6.1895 | {'meteor': 0.24049138001924492} |
-| 25.0158 | 38.75 | 155 | 6.0675 | {'meteor': 0.27280517248636943} |
-| 24.5341 | 40.0 | 160 | 5.9455 | {'meteor': 0.31174917426046617} |
-| 24.0207 | 41.25 | 165 | 5.8232 | {'meteor': 0.3346296347468463} |
-| 23.5043 | 42.5 | 170 | 5.6940 | {'meteor': 0.3640092970358211} |
-| 23.0053 | 43.75 | 175 | 5.5694 | {'meteor': 0.39298077183250996} |
-| 22.5147 | 45.0 | 180 | 5.4367 | {'meteor': 0.4101475012845666} |
-| 21.9149 | 46.25 | 185 | 5.3064 | {'meteor': 0.4454252489181049} |
-| 21.4485 | 47.5 | 190 | 5.1693 | {'meteor': 0.4488998242275543} |
-| 20.9062 | 48.75 | 195 | 5.0282 | {'meteor': 0.45517787205320787} |
-| 20.2946 | 50.0 | 200 | 4.8857 | {'meteor': 0.4670168114658528} |
-| 19.7408 | 51.25 | 205 | 4.7424 | {'meteor': 0.46885187008247564} |
-| 19.1833 | 52.5 | 210 | 4.5962 | {'meteor': 0.48009210397830365} |
-| 18.5146 | 53.75 | 215 | 4.4414 | {'meteor': 0.48190118690818967} |
-| 17.9353 | 55.0 | 220 | 4.2908 | {'meteor': 0.47995665529044884} |
-| 17.3241 | 56.25 | 225 | 4.1343 | {'meteor': 0.4585172700828576} |
-| 16.6322 | 57.5 | 230 | 3.9754 | {'meteor': 0.49224016290891875} |
-| 16.0912 | 58.75 | 235 | 3.8126 | {'meteor': 0.4905347101207873} |
-| 15.3454 | 60.0 | 240 | 3.6444 | {'meteor': 0.5111816415314908} |
-| 14.701 | 61.25 | 245 | 3.4792 | {'meteor': 0.5105523817202849} |
-| 13.9865 | 62.5 | 250 | 3.3082 | {'meteor': 0.5129290396554559} |
-| 13.3669 | 63.75 | 255 | 3.1352 | {'meteor': 0.5120252275835627} |
-| 12.6031 | 65.0 | 260 | 2.9628 | {'meteor': 0.5227148858383636} |
-| 11.9116 | 66.25 | 265 | 2.7814 | {'meteor': 0.5334414700925539} |
-| 11.1547 | 67.5 | 270 | 2.6098 | {'meteor': 0.529874570496232} |
-| 10.502 | 68.75 | 275 | 2.4253 | {'meteor': 0.5417984215155089} |
-| 9.7295 | 70.0 | 280 | 2.2425 | {'meteor': 0.5539765230302648} |
-| 9.0068 | 71.25 | 285 | 2.0664 | {'meteor': 0.5482088952549299} |
-| 8.2511 | 72.5 | 290 | 1.8853 | {'meteor': 0.5556276338633395} |
-| 7.5647 | 73.75 | 295 | 1.7105 | {'meteor': 0.5638295155858636} |
-| 6.8195 | 75.0 | 300 | 1.5389 | {'meteor': 0.5592943545681} |
-| 6.1153 | 76.25 | 305 | 1.3699 | {'meteor': 0.5635556260319741} |
-| 5.4348 | 77.5 | 310 | 1.2120 | {'meteor': 0.5634094242533236} |
-| 4.7841 | 78.75 | 315 | 1.0553 | {'meteor': 0.5515314216323808} |
-| 4.2107 | 80.0 | 320 | 0.9145 | {'meteor': 0.5670195626869615} |
+| Training Loss | Epoch | Step | Validation Loss | Meteor Score |
+|:-------------:|:-----:|:----:|:---------------:|:------------------------------:|
+| 66.7134 | 5.0 | 5 | 4.0753 | {'meteor': 0.5993281464976976} |
+| 66.485 | 10.0 | 10 | 4.0433 | {'meteor': 0.6089663843576959} |
+| 65.6963 | 15.0 | 15 | 3.9786 | {'meteor': 0.6095333838712824} |
+| 64.3776 | 20.0 | 20 | 3.8813 | {'meteor': 0.6134190979436022} |
+| 62.5373 | 25.0 | 25 | 3.7532 | {'meteor': 0.6085557200756545} |
+| 60.1653 | 30.0 | 30 | 3.5894 | {'meteor': 0.6101478691549013} |
+| 57.4769 | 35.0 | 35 | 3.3996 | {'meteor': 0.6216777032100391} |
+| 54.0515 | 40.0 | 40 | 3.1708 | {'meteor': 0.6117253911082575} |
+| 50.0954 | 45.0 | 45 | 2.9117 | {'meteor': 0.6318492684705416} |
+| 45.6823 | 50.0 | 50 | 2.6244 | {'meteor': 0.628492202016849} |
+| 40.8434 | 55.0 | 55 | 2.3222 | {'meteor': 0.6366743166997665} |
+| 35.9606 | 60.0 | 60 | 2.0199 | {'meteor': 0.6428147388556926} |
+| 31.3121 | 65.0 | 65 | 1.7651 | {'meteor': 0.6421314637392068} |
+| 27.5112 | 70.0 | 70 | 1.5673 | {'meteor': 0.6483513899407666} |
+| 24.653 | 75.0 | 75 | 1.4289 | {'meteor': 0.647761821003933} |
+| 22.7509 | 80.0 | 80 | 1.3469 | {'meteor': 0.6445735067208078} |
+| 21.6869 | 85.0 | 85 | 1.3067 | {'meteor': 0.651431012014952} |
+| 21.2456 | 90.0 | 90 | 1.2960 | {'meteor': 0.6500893825313575} |
 
 
 ### Framework versions
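Two of the updated hyperparameters are derived rather than independent: the total train batch size is train_batch_size × gradient_accumulation_steps = 6 × 16 = 96, and the cosine scheduler warms up over 50 of the 90 optimizer steps before decaying. A minimal sketch of that arithmetic and schedule shape in plain Python; the decay formula assumes the standard linear-warmup-plus-cosine schedule, and `base_lr` is an illustrative placeholder since the card's learning_rate line is not visible in this diff:

```python
import math

def total_train_batch_size(train_batch_size: int, grad_accum_steps: int) -> int:
    # Gradients accumulate over 16 micro-batches of 6 before each optimizer
    # step, giving an effective batch of 6 * 16 = 96.
    return train_batch_size * grad_accum_steps

def cosine_warmup_lr(step: int, base_lr: float,
                     warmup_steps: int, total_steps: int) -> float:
    # Linear warmup to base_lr, then cosine decay toward zero
    # (assumed shape of the "cosine" lr_scheduler_type above).
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(total_train_batch_size(6, 16))                 # → 96
print(cosine_warmup_lr(25, 1e-4, 50, 90) / 1e-4)     # → 0.5 (halfway through warmup)
```

With only 90 optimizer steps total, more than half the run is spent in warmup, which is consistent with the slow early loss decrease in the new results table.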
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:c0ee9fb20d0603304bfd91378b4357df15a44b578858219d488d246240438402
 size 1576851440
runs/Jul18_18-42-29_OZPC/events.out.tfevents.1752885754.OZPC.5052.3 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:c67a551c7b5b35df9d1eb10f1a694cc77469df9be80d21b09530d477532f9fee
+size 14237
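The two files above are not raw binaries but Git LFS pointer files: three-line text stubs recording the pointer spec version, the SHA-256 of the real object, and its size in bytes, which is why the diff shows only oid and size changing. A small sketch of reading such a pointer; the helper `parse_lfs_pointer` is illustrative, not part of any git-lfs tooling:

```python
def parse_lfs_pointer(text: str) -> dict:
    # Each pointer line is "key value"; keys here are version, oid, size.
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    fields["size"] = int(fields["size"])          # object size in bytes
    algo, _, digest = fields["oid"].partition(":")  # e.g. "sha256:..."
    fields["oid"] = {"algo": algo, "digest": digest}
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:c0ee9fb20d0603304bfd91378b4357df15a44b578858219d488d246240438402
size 1576851440
"""
info = parse_lfs_pointer(pointer)
print(info["size"])          # → 1576851440 (the ~1.5 GB safetensors blob)
print(info["oid"]["algo"])   # → sha256
```

The unchanged `size 1576851440` on model.safetensors indicates the weights were overwritten with a same-shaped checkpoint; only the content hash differs.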