ooliverz committed (verified) · Commit db029c1 · 1 parent: cbb1223

End of training
README.md CHANGED
@@ -1,11 +1,11 @@
 ---
-base_model: microsoft/git-large-r-coco
-datasets:
-- imagefolder
 library_name: transformers
 license: mit
+base_model: microsoft/git-large-r-coco
 tags:
 - generated_from_trainer
+datasets:
+- imagefolder
 model-index:
 - name: git-large-r-coco-IDB_ADv1_COCOv6-r
   results: []
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0736
-- Meteor Score: {'meteor': 0.5130868233592256}
+- Loss: 0.0759
+- Meteor Score: {'meteor': 0.5120753072373921}
 
 ## Model description
 
@@ -47,39 +47,43 @@ The following hyperparameters were used during training:
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_steps: 5
-- num_epochs: 130
+- num_epochs: 150
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Meteor Score |
-|:-------------:|:-------:|:----:|:---------------:|:-------------------------------:|
-| 28.9552 | 3.0 | 5 | 3.8917 | {'meteor': 0.37839770492261915} |
-| 33.7668 | 6.6667 | 10 | 3.3827 | {'meteor': 0.4406376553178631} |
-| 26.117 | 10.0 | 15 | 2.8454 | {'meteor': 0.4589591986431842} |
-| 21.8173 | 13.3333 | 20 | 2.3319 | {'meteor': 0.4761657478660956} |
-| 19.289 | 17.0 | 25 | 1.8524 | {'meteor': 0.4911702612206243} |
-| 12.3519 | 20.0 | 30 | 1.4161 | {'meteor': 0.49811670782749606} |
-| 9.6969 | 23.0 | 35 | 1.0411 | {'meteor': 0.5095075994488846} |
-| 8.6172 | 26.6667 | 40 | 0.7414 | {'meteor': 0.5066143143963856} |
-| 5.3936 | 30.0 | 45 | 0.5175 | {'meteor': 0.5048980919229241} |
-| 3.7103 | 33.3333 | 50 | 0.3607 | {'meteor': 0.507168142278137} |
-| 2.7173 | 37.0 | 55 | 0.2575 | {'meteor': 0.5062808163480844} |
-| 1.5217 | 40.0 | 60 | 0.1897 | {'meteor': 0.5110418337180939} |
-| 1.1127 | 43.0 | 65 | 0.1479 | {'meteor': 0.5082859864306299} |
-| 0.9827 | 46.6667 | 70 | 0.1217 | {'meteor': 0.5040127688652701} |
-| 0.6575 | 50.0 | 75 | 0.1029 | {'meteor': 0.5029842113033175} |
-| 0.5075 | 53.3333 | 80 | 0.0923 | {'meteor': 0.5103940781342445} |
-| 0.4441 | 57.0 | 85 | 0.0830 | {'meteor': 0.515034088697115} |
-| 0.2982 | 60.0 | 90 | 0.0791 | {'meteor': 0.5125632503730004} |
-| 0.2633 | 63.0 | 95 | 0.0778 | {'meteor': 0.5208474650088444} |
-| 0.2813 | 66.6667 | 100 | 0.0744 | {'meteor': 0.5170526013113127} |
-| 0.2253 | 70.0 | 105 | 0.0732 | {'meteor': 0.519943396763669} |
-| 0.2075 | 73.3333 | 110 | 0.0739 | {'meteor': 0.5161049751923272} |
-| 0.2149 | 77.0 | 115 | 0.0739 | {'meteor': 0.5134989277659963} |
-| 0.1679 | 80.0 | 120 | 0.0740 | {'meteor': 0.5134042014813657} |
-| 0.1646 | 83.0 | 125 | 0.0736 | {'meteor': 0.5129010201765198} |
-| 0.1999 | 86.6667 | 130 | 0.0736 | {'meteor': 0.5130868233592256} |
+| Training Loss | Epoch | Step | Validation Loss | Meteor Score |
+|:-------------:|:-------:|:----:|:---------------:|:------------------------------:|
+| 0.2014 | 3.0 | 5 | 0.0722 | {'meteor': 0.5065157314467645} |
+| 0.2196 | 6.6667 | 10 | 0.0627 | {'meteor': 0.5305128358337737} |
+| 0.1292 | 10.0 | 15 | 0.0594 | {'meteor': 0.5263979985260087} |
+| 0.0872 | 13.3333 | 20 | 0.0605 | {'meteor': 0.5108520041090149} |
+| 0.0818 | 17.0 | 25 | 0.0582 | {'meteor': 0.5113838555978679} |
+| 0.0531 | 20.0 | 30 | 0.0621 | {'meteor': 0.5103098091443673} |
+| 0.0443 | 23.0 | 35 | 0.0632 | {'meteor': 0.5018489405290812} |
+| 0.0489 | 26.6667 | 40 | 0.0639 | {'meteor': 0.5119666957218931} |
+| 0.0315 | 30.0 | 45 | 0.0648 | {'meteor': 0.5128484629162279} |
+| 0.0245 | 33.3333 | 50 | 0.0674 | {'meteor': 0.5114053511060893} |
+| 0.0213 | 37.0 | 55 | 0.0689 | {'meteor': 0.5103811878007981} |
+| 0.0108 | 40.0 | 60 | 0.0704 | {'meteor': 0.5080035696805529} |
+| 0.0089 | 43.0 | 65 | 0.0712 | {'meteor': 0.5235500043256491} |
+| 0.0088 | 46.6667 | 70 | 0.0730 | {'meteor': 0.5162015377767581} |
+| 0.0079 | 50.0 | 75 | 0.0704 | {'meteor': 0.508295723546233} |
+| 0.0055 | 53.3333 | 80 | 0.0727 | {'meteor': 0.5093121829271419} |
+| 0.0049 | 57.0 | 85 | 0.0739 | {'meteor': 0.5119514909960822} |
+| 0.0033 | 60.0 | 90 | 0.0749 | {'meteor': 0.5106012588241947} |
+| 0.0033 | 63.0 | 95 | 0.0751 | {'meteor': 0.5154173493739934} |
+| 0.0039 | 66.6667 | 100 | 0.0753 | {'meteor': 0.5147237489602302} |
+| 0.003 | 70.0 | 105 | 0.0758 | {'meteor': 0.5141283140032965} |
+| 0.0031 | 73.3333 | 110 | 0.0759 | {'meteor': 0.5147414640891067} |
+| 0.0033 | 77.0 | 115 | 0.0759 | {'meteor': 0.5143698813094443} |
+| 0.0026 | 80.0 | 120 | 0.0758 | {'meteor': 0.5133582948399001} |
+| 0.0025 | 83.0 | 125 | 0.0758 | {'meteor': 0.5121935317618276} |
+| 0.003 | 86.6667 | 130 | 0.0759 | {'meteor': 0.5121665250079868} |
+| 0.0026 | 90.0 | 135 | 0.0759 | {'meteor': 0.5116741831961337} |
+| 0.0026 | 93.3333 | 140 | 0.0759 | {'meteor': 0.512031945155839} |
+| 0.003 | 97.0 | 145 | 0.0759 | {'meteor': 0.5120926382524592} |
+| 0.0024 | 100.0 | 150 | 0.0759 | {'meteor': 0.5120753072373921} |
 
 
 ### Framework versions
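The updated card describes an image-captioning checkpoint fine-tuned from microsoft/git-large-r-coco. A minimal inference sketch with the transformers GIT API is below; the repo id `ooliverz/git-large-r-coco-IDB_ADv1_COCOv6-r` is an assumption pieced together from the committer name and the model-index name, and the script is untested against the actual weights.

```python
from PIL import Image
from transformers import AutoProcessor, AutoModelForCausalLM

# Assumed repo id (committer + model-index name); adjust if the model lives elsewhere.
CHECKPOINT = "ooliverz/git-large-r-coco-IDB_ADv1_COCOv6-r"

def caption(image_path: str, max_new_tokens: int = 40) -> str:
    """Generate a caption for one image with the fine-tuned GIT model."""
    processor = AutoProcessor.from_pretrained(CHECKPOINT)
    model = AutoModelForCausalLM.from_pretrained(CHECKPOINT)
    # GIT takes only pixel values as input and decodes the caption autoregressively.
    pixel_values = processor(
        images=Image.open(image_path).convert("RGB"), return_tensors="pt"
    ).pixel_values
    generated_ids = model.generate(
        pixel_values=pixel_values, max_new_tokens=max_new_tokens
    )
    return processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
```

GIT checkpoints load through `AutoModelForCausalLM` (resolving to `GitForCausalLM`), so the same generate-and-decode loop used for text LMs applies here.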
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:40f92f75861a07ab84fde169cd7a6859350315470490deca322a4bf4b237b7ae
+oid sha256:1c7a479924302d23342c70961a85921db90ff2525c3d19a5d1f5069a2fae5ddd
 size 1576851440
runs/Jul15_12-45-47_OZPC/events.out.tfevents.1752605150.OZPC.19756.2 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:97b54bcc8be720913bd545a33bdfa6eca445b70950f6e12be8151b9048673c68
-size 19125
+oid sha256:b476661e27ff86709510980057584e4b0e22b2811bac1f37fabe2b3abd5b26fb
+size 19961
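The weight and TensorBoard-event diffs above change Git LFS pointer files, not the binaries themselves: each pointer is a three-line `version` / `oid` / `size` text stub. A minimal stdlib sketch of reading such a pointer, using the new safetensors oid and size from the diff:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a {key: value} dict."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "key value", split on the first space.
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:1c7a479924302d23342c70961a85921db90ff2525c3d19a5d1f5069a2fae5ddd
size 1576851440
"""
info = parse_lfs_pointer(pointer)
# info["oid"] carries the sha256 of the real blob; info["size"] its byte length.
```

Since only the oid (and, for the event file, the size) changes in a retrain, these tiny stubs are what Git actually versions, while the large blobs live in LFS storage.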