End of training
README.md
CHANGED
@@ -1,11 +1,11 @@
 ---
-base_model: microsoft/git-large-r-coco
-datasets:
-- imagefolder
 library_name: transformers
 license: mit
+base_model: microsoft/git-large-r-coco
 tags:
 - generated_from_trainer
+datasets:
+- imagefolder
 model-index:
 - name: git-large-r-coco-IDB_ADv1_COCOv6-r
   results: []
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->

 This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss:
-- Meteor Score: {'meteor': 0.
+- Loss: 2.8325
+- Meteor Score: {'meteor': 0.44716808145000575}

 ## Model description

@@ -47,43 +47,33 @@ The following hyperparameters were used during training:
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_steps: 5
-- num_epochs:
+- num_epochs: 100
 - mixed_precision_training: Native AMP

 ### Training results

-| Training Loss | Epoch | Step | Validation Loss | Meteor Score
-| 0.003 | 70.0 | 105 | 0.0758 | {'meteor': 0.5141283140032965} |
-| 0.0031 | 73.3333 | 110 | 0.0759 | {'meteor': 0.5147414640891067} |
-| 0.0033 | 77.0 | 115 | 0.0759 | {'meteor': 0.5143698813094443} |
-| 0.0026 | 80.0 | 120 | 0.0758 | {'meteor': 0.5133582948399001} |
-| 0.0025 | 83.0 | 125 | 0.0758 | {'meteor': 0.5121935317618276} |
-| 0.003 | 86.6667 | 130 | 0.0759 | {'meteor': 0.5121665250079868} |
-| 0.0026 | 90.0 | 135 | 0.0759 | {'meteor': 0.5116741831961337} |
-| 0.0026 | 93.3333 | 140 | 0.0759 | {'meteor': 0.512031945155839} |
-| 0.003 | 97.0 | 145 | 0.0759 | {'meteor': 0.5120926382524592} |
-| 0.0024 | 100.0 | 150 | 0.0759 | {'meteor': 0.5120753072373921} |
+| Training Loss | Epoch | Step | Validation Loss | Meteor Score |
+|:-------------:|:-------:|:----:|:---------------:|:-------------------------------:|
+| 82.9228 | 3.0 | 5 | 10.5760 | {'meteor': 0.05763852785320578} |
+| 89.2818 | 6.6667 | 10 | 9.2973 | {'meteor': 0.0566230094484481} |
+| 71.0589 | 10.0 | 15 | 7.9880 | {'meteor': 0.06313383286499662} |
+| 62.1814 | 13.3333 | 20 | 7.1226 | {'meteor': 0.11846920353982553} |
+| 61.4373 | 17.0 | 25 | 6.4914 | {'meteor': 0.13491195786509014} |
+| 45.9064 | 20.0 | 30 | 5.9411 | {'meteor': 0.16754599930031555} |
+| 42.5802 | 23.0 | 35 | 5.4427 | {'meteor': 0.1915275196011832} |
+| 47.7153 | 26.6667 | 40 | 4.9903 | {'meteor': 0.32374710181123983} |
+| 39.6858 | 30.0 | 45 | 4.5799 | {'meteor': 0.3415089427560285} |
+| 36.5271 | 33.3333 | 50 | 4.2147 | {'meteor': 0.37617485564825703} |
+| 36.9405 | 37.0 | 55 | 3.8945 | {'meteor': 0.3963594699614644} |
+| 28.0545 | 40.0 | 60 | 3.6215 | {'meteor': 0.40938786947486} |
+| 26.4561 | 43.0 | 65 | 3.3945 | {'meteor': 0.42123476340997434} |
+| 30.432 | 46.6667 | 70 | 3.2123 | {'meteor': 0.4147437564703583} |
+| 26.2868 | 50.0 | 75 | 3.0723 | {'meteor': 0.4409017964342259} |
+| 25.2682 | 53.3333 | 80 | 2.9708 | {'meteor': 0.4466618012682805} |
+| 26.9821 | 57.0 | 85 | 2.9025 | {'meteor': 0.44590994086182606} |
+| 21.6659 | 60.0 | 90 | 2.8609 | {'meteor': 0.4467688108057412} |
+| 21.4648 | 63.0 | 95 | 2.8398 | {'meteor': 0.44716808145000575} |
+| 26.1056 | 66.6667 | 100 | 2.8325 | {'meteor': 0.44716808145000575} |


 ### Framework versions
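The card itself ships no usage snippet, so here is only a rough sketch of how a GIT-style captioning checkpoint like this one is typically loaded with `transformers`; the local checkpoint path and the image filename are placeholders, not values taken from this repository.

```python
# Hypothetical usage sketch for the fine-tuned GIT captioning checkpoint.
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

checkpoint = "./git-large-r-coco-IDB_ADv1_COCOv6-r"  # placeholder path to the fine-tuned weights
processor = AutoProcessor.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder image
pixel_values = processor(images=image, return_tensors="pt").pixel_values

# GIT generates the caption autoregressively from the image features.
generated_ids = model.generate(pixel_values=pixel_values, max_length=50)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```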
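For readers reproducing the run, the hyperparameters listed in the hunk above map roughly onto the following `TrainingArguments`; values not shown in this diff (learning rate, batch sizes, seed) are deliberately omitted, and the output directory is an assumption.

```python
# Rough mapping of the listed hyperparameters onto TrainingArguments (a sketch, not the
# author's actual training script; unshown values are left out).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="git-large-r-coco-IDB_ADv1_COCOv6-r",  # assumed output directory
    optim="adamw_torch",         # adamw_torch with default betas/epsilon
    lr_scheduler_type="cosine",  # cosine schedule
    warmup_steps=5,              # lr_scheduler_warmup_steps: 5
    num_train_epochs=100,        # num_epochs: 100
    fp16=True,                   # mixed_precision_training: Native AMP
)
```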
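The Meteor Score column stores the raw dict returned by the metric. A minimal sketch of how such a value is typically produced with the `evaluate` library (the example captions below are invented):

```python
# Minimal METEOR computation sketch; the captions are made up for illustration.
import evaluate

meteor = evaluate.load("meteor")
result = meteor.compute(
    predictions=["a dog running across a grassy field"],
    references=["a dog runs through the grass"],
)
print(result)  # e.g. {'meteor': 0.5...}, the same dict shape as in the table above
```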
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:6c41cebafa85c94d2581fb906c04f719f4a458febbb4068770727b9831d7cb56
 size 1576851440
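The file above is a Git LFS pointer: `oid` is the SHA-256 of the actual `model.safetensors` and `size` is its byte count. A small sketch, assuming the real file has already been downloaded locally, for checking a download against the pointer:

```python
# Verify a downloaded LFS object against the pointer's oid and size (local path assumed).
import hashlib
import os

path = "model.safetensors"  # assumed local download
expected_oid = "6c41cebafa85c94d2581fb906c04f719f4a458febbb4068770727b9831d7cb56"
expected_size = 1576851440

sha256 = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha256.update(chunk)

assert os.path.getsize(path) == expected_size
assert sha256.hexdigest() == expected_oid
```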
runs/Jul16_07-06-21_OZPC/events.out.tfevents.1752671189.OZPC.19412.0
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:aab28107b535b4b54340080e489a2b80eb94371d592f9428476ca182344e8251
+size 15220
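The updated run log is a TensorBoard event file. To inspect the logged scalars without launching TensorBoard, something along these lines works; the tag name used below is a guess, so list the available tags first.

```python
# Sketch: read scalars out of the tfevents file with TensorBoard's EventAccumulator.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("runs/Jul16_07-06-21_OZPC")  # directory containing the events file
ea.Reload()
print(ea.Tags()["scalars"])               # discover which scalar tags were logged
for event in ea.Scalars("train/loss"):    # tag name is a guess; pick one from the list above
    print(event.step, event.value)
```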