IAmSkyDra committed
Commit e4893d4 · verified · 1 Parent(s): 73fec28

End of training

Files changed (1):
1. README.md (+13 -18)
README.md CHANGED
@@ -1,7 +1,7 @@
  ---
- base_model: vinai/bartpho-syllable
  library_name: transformers
  license: mit
+ base_model: vinai/bartpho-syllable
  tags:
  - generated_from_trainer
  model-index:
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [vinai/bartpho-syllable](https://huggingface.co/vinai/bartpho-syllable) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.3500
+ - Loss: 1.5525

  ## Model description

@@ -41,28 +41,23 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: linear
- - num_epochs: 15
+ - num_epochs: 10
  - mixed_precision_training: Native AMP

  ### Training results

  | Training Loss | Epoch | Step | Validation Loss |
  |:-------------:|:-----:|:-----:|:---------------:|
- | 4.2378 | 1.0 | 1316 | 3.1132 |
- | 2.7031 | 2.0 | 2632 | 2.3966 |
- | 2.3759 | 3.0 | 3948 | 2.0374 |
- | 2.0853 | 4.0 | 5264 | 1.8846 |
- | 1.923 | 5.0 | 6580 | 1.7449 |
- | 1.8195 | 6.0 | 7896 | 1.6978 |
- | 1.7291 | 7.0 | 9212 | 1.6243 |
- | 1.6472 | 8.0 | 10528 | 1.5380 |
- | 1.6053 | 9.0 | 11844 | 1.5235 |
- | 1.5479 | 10.0 | 13160 | 1.4200 |
- | 1.5175 | 11.0 | 14476 | 1.3930 |
- | 1.4861 | 12.0 | 15792 | 1.4136 |
- | 1.4566 | 13.0 | 17108 | 1.3706 |
- | 1.4263 | 14.0 | 18424 | 1.3472 |
- | 1.4147 | 15.0 | 19740 | 1.3436 |
+ | 4.2284 | 1.0 | 1316 | 3.1527 |
+ | 2.6988 | 2.0 | 2632 | 2.3799 |
+ | 2.38 | 3.0 | 3948 | 2.0462 |
+ | 2.0879 | 4.0 | 5264 | 1.8895 |
+ | 1.9391 | 5.0 | 6580 | 1.7817 |
+ | 1.8423 | 6.0 | 7896 | 1.7480 |
+ | 1.7697 | 7.0 | 9212 | 1.6467 |
+ | 1.6992 | 8.0 | 10528 | 1.5939 |
+ | 1.6668 | 9.0 | 11844 | 1.5594 |
+ | 1.632 | 10.0 | 13160 | 1.5200 |


  ### Framework versions
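
The hyperparameter hunk above maps directly onto 🤗 Transformers `Seq2SeqTrainingArguments`. Below is a minimal sketch of the updated configuration; only the seed, optimizer, scheduler, epoch count, and mixed-precision settings come from the diff, while `output_dir`, the learning rate, and the batch size are illustrative assumptions.

```python
# Sketch of the training configuration implied by the updated card.
# Values marked "from the diff" are taken from the hyperparameter list;
# everything else is a placeholder assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bartpho-syllable-finetuned",  # assumption: not stated in the card
    seed=42,                                  # from the diff
    optim="adamw_torch",                      # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,                           # from the diff
    adam_beta2=0.999,                         # from the diff
    adam_epsilon=1e-8,                        # from the diff
    lr_scheduler_type="linear",               # from the diff
    num_train_epochs=10,                      # changed from 15 to 10 in this commit
    fp16=True,                                # "Native AMP" mixed-precision training
    learning_rate=5e-5,                       # assumption: library default, not shown in the hunk
    per_device_train_batch_size=8,            # assumption: not shown in the hunk
)
```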
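
For usage, the checkpoint described by this card is a fine-tune of [vinai/bartpho-syllable](https://huggingface.co/vinai/bartpho-syllable), so it loads like any other BARTpho seq2seq model. A minimal sketch follows; the fine-tuned repository id is not stated in this diff, so the base model id stands in for it, and the Vietnamese example sentence is purely illustrative.

```python
# Minimal usage sketch: load a BARTpho-style seq2seq checkpoint and generate.
# "vinai/bartpho-syllable" is the base model named in the card; swap in the
# fine-tuned repository id once known.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "vinai/bartpho-syllable"  # placeholder for the fine-tuned repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Chúng tôi là những nghiên cứu viên.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```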