ooliverz committed · verified
Commit c4f7be3 · Parent: 9cf316e

End of training

README.md CHANGED
@@ -1,11 +1,11 @@
 ---
-base_model: microsoft/git-large-r-coco
-datasets:
-- imagefolder
 library_name: transformers
 license: mit
+base_model: microsoft/git-large-r-coco
 tags:
 - generated_from_trainer
+datasets:
+- imagefolder
 model-index:
 - name: git-large-r-coco-IDB_ADv1_COCOv6-rv2
   results: []
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1043
-- Meteor Score: {'meteor': 0.6687313153455023}
+- Loss: 0.0934
+- Meteor Score: {'meteor': 0.6849337864778346}
 
 ## Model description
 
@@ -45,7 +45,7 @@ The following hyperparameters were used during training:
 - gradient_accumulation_steps: 16
 - total_train_batch_size: 96
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
-- lr_scheduler_type: cosine
+- lr_scheduler_type: cosine_with_restarts
 - lr_scheduler_warmup_steps: 50
 - num_epochs: 20
 - mixed_precision_training: Native AMP
@@ -54,10 +54,10 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Meteor Score |
 |:-------------:|:-----:|:----:|:---------------:|:------------------------------:|
-| 1.0495        | 5.0   | 5    | 0.1148          | {'meteor': 0.6689391701123871} |
-| 1.0158        | 10.0  | 10   | 0.1133          | {'meteor': 0.6657644639932019} |
-| 0.9478        | 15.0  | 15   | 0.1081          | {'meteor': 0.6626441748430653} |
-| 0.8514        | 20.0  | 20   | 0.1043          | {'meteor': 0.6687313153455023} |
+| 0.7802        | 5.0   | 5    | 0.1019          | {'meteor': 0.6667354024231188} |
+| 0.756         | 10.0  | 10   | 0.1006          | {'meteor': 0.6766394690979587} |
+| 0.7057        | 15.0  | 15   | 0.0972          | {'meteor': 0.6774630113691023} |
+| 0.6349        | 20.0  | 20   | 0.0934          | {'meteor': 0.6849337864778346} |
 
 
 ### Framework versions
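The hyperparameter change above swaps plain cosine decay for cosine with restarts. As a rough sketch of the learning-rate multiplier such a schedule produces (a hypothetical standalone function, not the exact `transformers` implementation): linear warmup over the warmup steps, then a cosine curve that resets to its peak at each cycle boundary instead of decaying once.

```python
import math

def cosine_with_restarts(step, warmup_steps, total_steps, num_cycles=2):
    """LR multiplier: linear warmup, then cosine decay restarted num_cycles times.

    A sketch of the shape of a 'cosine_with_restarts' schedule; the real
    scheduler in a training framework may differ in details.
    """
    if step < warmup_steps:
        # Linear warmup from 0 to 1 over the first warmup_steps steps.
        return step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    # (num_cycles * progress) % 1.0 restarts the cosine at each cycle boundary.
    return 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0)))
```

With `warmup_steps=50` (as in the card) the multiplier climbs to 1.0 at step 50, decays toward 0, then jumps back to 1.0 at each restart before decaying again.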
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e78f12ab90e2137d0d6afe0f02ae7c5ec7caea23885812985b47fe1defeb96cf
+oid sha256:a2fc8087ade0e5b9b4af3cb1d0ad4b7e4dc021bf21021928ea5decdacdfc72ef
 size 1576851440
runs/Jul18_19-16-49_OZPC/events.out.tfevents.1752887810.OZPC.5052.7 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:5187df82036d07a917e5c59bb7375113583a1871ad089e5736c09083e0375ebd
-size 6808
+oid sha256:15093d7997527a160717bffe7d5985f07e9b5041ee710e4db62e328b2d167368
+size 7629