ooliverz committed
Commit cbe3c92 · verified · 1 Parent(s): 9c6a218

End of training
README.md CHANGED
@@ -1,11 +1,11 @@
 ---
-base_model: microsoft/git-large-r-coco
-datasets:
-- imagefolder
 library_name: transformers
 license: mit
+base_model: microsoft/git-large-r-coco
 tags:
 - generated_from_trainer
+datasets:
+- imagefolder
 model-index:
 - name: git-large-r-coco-IDB_ADv1_COCOv6-rv2
   results: []
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.2960
-- Meteor Score: {'meteor': 0.6500893825313575}
+- Loss: 0.1043
+- Meteor Score: {'meteor': 0.6687313153455023}
 
 ## Model description
 
@@ -47,31 +47,17 @@ The following hyperparameters were used during training:
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_steps: 50
-- num_epochs: 90
+- num_epochs: 20
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Meteor Score                   |
 |:-------------:|:-----:|:----:|:---------------:|:------------------------------:|
-| 66.7134       | 5.0   | 5    | 4.0753          | {'meteor': 0.5993281464976976} |
-| 66.485        | 10.0  | 10   | 4.0433          | {'meteor': 0.6089663843576959} |
-| 65.6963       | 15.0  | 15   | 3.9786          | {'meteor': 0.6095333838712824} |
-| 64.3776       | 20.0  | 20   | 3.8813          | {'meteor': 0.6134190979436022} |
-| 62.5373       | 25.0  | 25   | 3.7532          | {'meteor': 0.6085557200756545} |
-| 60.1653       | 30.0  | 30   | 3.5894          | {'meteor': 0.6101478691549013} |
-| 57.4769       | 35.0  | 35   | 3.3996          | {'meteor': 0.6216777032100391} |
-| 54.0515       | 40.0  | 40   | 3.1708          | {'meteor': 0.6117253911082575} |
-| 50.0954       | 45.0  | 45   | 2.9117          | {'meteor': 0.6318492684705416} |
-| 45.6823       | 50.0  | 50   | 2.6244          | {'meteor': 0.628492202016849}  |
-| 40.8434       | 55.0  | 55   | 2.3222          | {'meteor': 0.6366743166997665} |
-| 35.9606       | 60.0  | 60   | 2.0199          | {'meteor': 0.6428147388556926} |
-| 31.3121       | 65.0  | 65   | 1.7651          | {'meteor': 0.6421314637392068} |
-| 27.5112       | 70.0  | 70   | 1.5673          | {'meteor': 0.6483513899407666} |
-| 24.653        | 75.0  | 75   | 1.4289          | {'meteor': 0.647761821003933}  |
-| 22.7509       | 80.0  | 80   | 1.3469          | {'meteor': 0.6445735067208078} |
-| 21.6869       | 85.0  | 85   | 1.3067          | {'meteor': 0.651431012014952}  |
-| 21.2456       | 90.0  | 90   | 1.2960          | {'meteor': 0.6500893825313575} |
+| 1.0495        | 5.0   | 5    | 0.1148          | {'meteor': 0.6689391701123871} |
+| 1.0158        | 10.0  | 10   | 0.1133          | {'meteor': 0.6657644639932019} |
+| 0.9478        | 15.0  | 15   | 0.1081          | {'meteor': 0.6626441748430653} |
+| 0.8514        | 20.0  | 20   | 0.1043          | {'meteor': 0.6687313153455023} |
 
 
 ### Framework versions
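The num_epochs change above is the only hyperparameter edited in this commit; the rest of the listed setup (adamw_torch, cosine schedule, 50 warmup steps, native AMP) maps onto the Transformers `TrainingArguments` API roughly as in the sketch below. This is a sketch under assumptions, not the author's training script: `output_dir` is a placeholder, and values not shown in this diff (learning rate, batch sizes) are omitted rather than guessed.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed in the card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="git-large-r-coco-IDB_ADv1_COCOv6-rv2",  # placeholder output directory
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-08 (library defaults)
    lr_scheduler_type="cosine",
    warmup_steps=50,
    num_train_epochs=20,
    fp16=True,                    # "Native AMP" mixed-precision training
)
```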
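The card reports the METEOR score as a raw dict, which is the format returned by the `evaluate` library's `meteor` metric. A minimal sketch of how such a value is typically computed; the caption strings are placeholders, not data from this run:

```python
# Compute METEOR for generated captions against reference captions.
import evaluate

meteor = evaluate.load("meteor")
predictions = ["a captioned example image"]            # hypothetical model outputs
references = ["a reference caption for the image"]     # hypothetical ground truth
print(meteor.compute(predictions=predictions, references=references))
# -> {'meteor': <score>}
```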
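Since the card itself only lists results, here is a minimal captioning sketch using the standard GIT interface in Transformers. The repo id `ooliverz/git-large-r-coco-IDB_ADv1_COCOv6-rv2` (namespace taken from the committer) and the image path are assumptions:

```python
# Minimal image-captioning sketch with the fine-tuned GIT checkpoint.
from transformers import AutoProcessor, AutoModelForCausalLM
from PIL import Image

repo_id = "ooliverz/git-large-r-coco-IDB_ADv1_COCOv6-rv2"  # assumed repo id
processor = AutoProcessor.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

image = Image.open("example.jpg")  # hypothetical local image
pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values=pixel_values, max_length=50)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```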
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3b9a31e8ca56a34f73944216f08391349b43f240f7d589d68eeceefef6f3b6b2
+oid sha256:cdb61ed35730550b6656bb7b8475d39af040d1a3828b250a3d9dd6abf9c8c949
 size 1576851440
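The model.safetensors entry only swaps the Git LFS pointer, so the new weights are identified by the sha256 oid above. A minimal sketch, assuming a locally downloaded copy of the file, for checking a download against that oid:

```python
# Verify a local model.safetensors against the LFS pointer's sha256 oid.
import hashlib

EXPECTED_OID = "cdb61ed35730550b6656bb7b8475d39af040d1a3828b250a3d9dd6abf9c8c949"

h = hashlib.sha256()
with open("model.safetensors", "rb") as f:  # assumed local path
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
print("match" if h.hexdigest() == EXPECTED_OID else "mismatch")
```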
runs/Jul18_19-12-11_OZPC/events.out.tfevents.1752887535.OZPC.5052.6 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f5113c013c0f4be6da6a7f6a3079a9eddc4134613e48e588da840d3e93af1391
-size 6794
+oid sha256:bc0118580cab705e16e53fb2b63e882c1029e3b933f37a6bc28bafdee49d020d
+size 7615
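The updated tfevents file holds the TensorBoard logs for this training run. A sketch, assuming the `tensorboard` package and a local copy of the run directory, for listing the scalars it contains:

```python
# Inspect the logged scalars in the updated TensorBoard event file.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Jul18_19-12-11_OZPC")  # assumed local path to the run directory
acc.Reload()
scalar_tags = acc.Tags()["scalars"]
print(scalar_tags)  # tag names logged by the Trainer, e.g. loss and eval metrics
if scalar_tags:
    for event in acc.Scalars(scalar_tags[0]):
        print(event.step, event.value)
```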