ooliverz committed on
Commit 4813376 · verified · 1 Parent(s): cda1b86

End of training

README.md CHANGED
@@ -1,11 +1,11 @@
  ---
- base_model: microsoft/git-large-r-coco
- datasets:
- - imagefolder
  library_name: transformers
  license: mit
+ base_model: microsoft/git-large-r-coco
  tags:
  - generated_from_trainer
+ datasets:
+ - imagefolder
  model-index:
  - name: git-large-r-coco-IDB_ADv1_COCOv6-r
    results: []
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0759
- - Meteor Score: {'meteor': 0.5120753072373921}
+ - Loss: 2.8325
+ - Meteor Score: {'meteor': 0.44716808145000575}

  ## Model description

@@ -47,43 +47,33 @@ The following hyperparameters were used during training:
  - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: cosine
  - lr_scheduler_warmup_steps: 5
- - num_epochs: 150
+ - num_epochs: 100
  - mixed_precision_training: Native AMP

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Meteor Score |
- |:-------------:|:-------:|:----:|:---------------:|:------------------------------:|
- | 0.2014 | 3.0 | 5 | 0.0722 | {'meteor': 0.5065157314467645} |
- | 0.2196 | 6.6667 | 10 | 0.0627 | {'meteor': 0.5305128358337737} |
- | 0.1292 | 10.0 | 15 | 0.0594 | {'meteor': 0.5263979985260087} |
- | 0.0872 | 13.3333 | 20 | 0.0605 | {'meteor': 0.5108520041090149} |
- | 0.0818 | 17.0 | 25 | 0.0582 | {'meteor': 0.5113838555978679} |
- | 0.0531 | 20.0 | 30 | 0.0621 | {'meteor': 0.5103098091443673} |
- | 0.0443 | 23.0 | 35 | 0.0632 | {'meteor': 0.5018489405290812} |
- | 0.0489 | 26.6667 | 40 | 0.0639 | {'meteor': 0.5119666957218931} |
- | 0.0315 | 30.0 | 45 | 0.0648 | {'meteor': 0.5128484629162279} |
- | 0.0245 | 33.3333 | 50 | 0.0674 | {'meteor': 0.5114053511060893} |
- | 0.0213 | 37.0 | 55 | 0.0689 | {'meteor': 0.5103811878007981} |
- | 0.0108 | 40.0 | 60 | 0.0704 | {'meteor': 0.5080035696805529} |
- | 0.0089 | 43.0 | 65 | 0.0712 | {'meteor': 0.5235500043256491} |
- | 0.0088 | 46.6667 | 70 | 0.0730 | {'meteor': 0.5162015377767581} |
- | 0.0079 | 50.0 | 75 | 0.0704 | {'meteor': 0.508295723546233} |
- | 0.0055 | 53.3333 | 80 | 0.0727 | {'meteor': 0.5093121829271419} |
- | 0.0049 | 57.0 | 85 | 0.0739 | {'meteor': 0.5119514909960822} |
- | 0.0033 | 60.0 | 90 | 0.0749 | {'meteor': 0.5106012588241947} |
- | 0.0033 | 63.0 | 95 | 0.0751 | {'meteor': 0.5154173493739934} |
- | 0.0039 | 66.6667 | 100 | 0.0753 | {'meteor': 0.5147237489602302} |
- | 0.003 | 70.0 | 105 | 0.0758 | {'meteor': 0.5141283140032965} |
- | 0.0031 | 73.3333 | 110 | 0.0759 | {'meteor': 0.5147414640891067} |
- | 0.0033 | 77.0 | 115 | 0.0759 | {'meteor': 0.5143698813094443} |
- | 0.0026 | 80.0 | 120 | 0.0758 | {'meteor': 0.5133582948399001} |
- | 0.0025 | 83.0 | 125 | 0.0758 | {'meteor': 0.5121935317618276} |
- | 0.003 | 86.6667 | 130 | 0.0759 | {'meteor': 0.5121665250079868} |
- | 0.0026 | 90.0 | 135 | 0.0759 | {'meteor': 0.5116741831961337} |
- | 0.0026 | 93.3333 | 140 | 0.0759 | {'meteor': 0.512031945155839} |
- | 0.003 | 97.0 | 145 | 0.0759 | {'meteor': 0.5120926382524592} |
- | 0.0024 | 100.0 | 150 | 0.0759 | {'meteor': 0.5120753072373921} |
+ | Training Loss | Epoch | Step | Validation Loss | Meteor Score |
+ |:-------------:|:-------:|:----:|:---------------:|:-------------------------------:|
+ | 82.9228 | 3.0 | 5 | 10.5760 | {'meteor': 0.05763852785320578} |
+ | 89.2818 | 6.6667 | 10 | 9.2973 | {'meteor': 0.0566230094484481} |
+ | 71.0589 | 10.0 | 15 | 7.9880 | {'meteor': 0.06313383286499662} |
+ | 62.1814 | 13.3333 | 20 | 7.1226 | {'meteor': 0.11846920353982553} |
+ | 61.4373 | 17.0 | 25 | 6.4914 | {'meteor': 0.13491195786509014} |
+ | 45.9064 | 20.0 | 30 | 5.9411 | {'meteor': 0.16754599930031555} |
+ | 42.5802 | 23.0 | 35 | 5.4427 | {'meteor': 0.1915275196011832} |
+ | 47.7153 | 26.6667 | 40 | 4.9903 | {'meteor': 0.32374710181123983} |
+ | 39.6858 | 30.0 | 45 | 4.5799 | {'meteor': 0.3415089427560285} |
+ | 36.5271 | 33.3333 | 50 | 4.2147 | {'meteor': 0.37617485564825703} |
+ | 36.9405 | 37.0 | 55 | 3.8945 | {'meteor': 0.3963594699614644} |
+ | 28.0545 | 40.0 | 60 | 3.6215 | {'meteor': 0.40938786947486} |
+ | 26.4561 | 43.0 | 65 | 3.3945 | {'meteor': 0.42123476340997434} |
+ | 30.432 | 46.6667 | 70 | 3.2123 | {'meteor': 0.4147437564703583} |
+ | 26.2868 | 50.0 | 75 | 3.0723 | {'meteor': 0.4409017964342259} |
+ | 25.2682 | 53.3333 | 80 | 2.9708 | {'meteor': 0.4466618012682805} |
+ | 26.9821 | 57.0 | 85 | 2.9025 | {'meteor': 0.44590994086182606} |
+ | 21.6659 | 60.0 | 90 | 2.8609 | {'meteor': 0.4467688108057412} |
+ | 21.4648 | 63.0 | 95 | 2.8398 | {'meteor': 0.44716808145000575} |
+ | 26.1056 | 66.6667 | 100 | 2.8325 | {'meteor': 0.44716808145000575} |


  ### Framework versions
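
The hyperparameters listed in the README hunk above (adamw_torch with betas=(0.9,0.999) and epsilon=1e-08, a cosine schedule with 5 warmup steps, num_epochs cut from 150 to 100, native AMP) map onto `transformers.TrainingArguments` roughly as in the sketch below. The learning rate, batch sizes, and output directory are not visible in this hunk, so they are placeholders rather than values from the actual run.

```python
from transformers import TrainingArguments

# Rough sketch only -- fields not shown in the diff are placeholders.
training_args = TrainingArguments(
    output_dir="git-large-r-coco-IDB_ADv1_COCOv6-r",  # placeholder output path
    optim="adamw_torch",            # adamw_torch optimizer
    adam_beta1=0.9,                 # betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,              # epsilon=1e-08
    lr_scheduler_type="cosine",
    warmup_steps=5,                 # lr_scheduler_warmup_steps: 5
    num_train_epochs=100,           # reduced from 150 in this commit
    fp16=True,                      # "Native AMP" mixed-precision training
)
```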
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d2f17ddad9b5133feee2b6646a19d7a0562ecca676345dd42612fa96ad726c3f
+ oid sha256:6c41cebafa85c94d2581fb906c04f719f4a458febbb4068770727b9831d7cb56
  size 1576851440
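
The Meteor Score column in the README's results table stores the raw dictionary returned by the `evaluate` library's METEOR metric. A minimal sketch of how such a value is produced; the captions below are made-up placeholders, not examples from the evaluation set.

```python
import evaluate

# Loads the NLTK-backed METEOR metric (downloads WordNet data on first use).
meteor = evaluate.load("meteor")

predictions = ["a red car parked on the street"]       # hypothetical model output
references = ["a red car is parked along the street"]  # hypothetical reference caption
score = meteor.compute(predictions=predictions, references=references)
print(score)  # {'meteor': ...} -- same dict format as logged in the table
```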
runs/Jul16_07-06-21_OZPC/events.out.tfevents.1752671189.OZPC.19412.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2abde4dad482ea76b80cb0502af939319e2b9836f6f99a2ddb20b61b8e8dd181
- size 14399
+ oid sha256:aab28107b535b4b54340080e489a2b80eb94371d592f9428476ca182344e8251
+ size 15220
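
Since this commit also replaces `model.safetensors`, the updated checkpoint can be loaded for caption generation in the usual GIT fashion. A hedged usage sketch: the repository id is assumed from the commit author and the model-index name, and the sample image URL is only an illustration.

```python
import requests
from PIL import Image
from transformers import AutoProcessor, AutoModelForCausalLM

repo_id = "ooliverz/git-large-r-coco-IDB_ADv1_COCOv6-r"  # assumed repo id
processor = AutoProcessor.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Any RGB image works; this is a standard COCO sample used for illustration.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values=pixel_values, max_length=50)
caption = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(caption)
```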