|
--- |
|
tags: |
|
- generated_from_trainer |
|
model-index: |
|
- name: wav2vec2-large-xls-r-300m-korean-d |
|
results: [] |
|
--- |
|
|
|
|
|
|
# wav2vec2-large-xls-r-300m-korean-d |
|
|
|
This model appears to be a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) for Korean automatic speech recognition; the training dataset was not recorded by the Trainer.
|
It achieves the following results on the evaluation set: |
|
- Loss: 5.0644 |
|
- CER (character error rate): 0.8255
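
For reference, a character error rate like the one above can be reproduced with the Hugging Face `evaluate` library. This is only a sketch; the transcripts below are placeholders, not data from this model's evaluation set:

```python
# Sketch of how a CER figure is typically computed with the `evaluate`
# library (requires `jiwer`). The strings below are placeholder examples,
# not actual predictions or references from this model.
import evaluate

cer_metric = evaluate.load("cer")

predictions = ["안녕하세요 반갑습니다"]    # hypothetical model transcriptions
references = ["안녕하세요, 반갑습니다."]   # hypothetical ground-truth transcripts

print(cer_metric.compute(predictions=predictions, references=references))
```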
|
|
|
## Model description |
|
|
|
More information needed |
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
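
As a starting point, the snippet below sketches how a Wav2Vec2 CTC checkpoint like this one is typically used for inference with 🤗 Transformers. The repository id and audio path are placeholders, and the high evaluation CER (≈0.83) suggests transcription quality will be limited:

```python
# Hypothetical inference sketch for a Wav2Vec2 CTC model. The repository id
# and audio path are placeholders; adjust them to the actual checkpoint.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

repo_id = "wav2vec2-large-xls-r-300m-korean-d"  # placeholder hub id
processor = Wav2Vec2Processor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
model.eval()

# XLS-R models expect 16 kHz mono input.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```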
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
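
Since no dataset is documented here, the following is only a hypothetical sketch of how audio data is usually prepared for a Wav2Vec2/XLS-R fine-tune with 🤗 Datasets; the dataset id and column names are placeholders:

```python
# Hypothetical data-preparation sketch. The dataset id and column names are
# placeholders; the corpus actually used to train this model is not documented.
from datasets import Audio, load_dataset

dataset = load_dataset("path/to/korean-asr-dataset", split="train")  # placeholder
# XLS-R is pretrained on 16 kHz audio, so resample the audio column accordingly.
dataset = dataset.cast_column("audio", Audio(sampling_rate=16_000))

sample = dataset[0]
print(sample["audio"]["array"].shape, sample["audio"]["sampling_rate"])
```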
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 5e-05 |
|
- train_batch_size: 4 |
|
- eval_batch_size: 8 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- lr_scheduler_warmup_steps: 100 |
|
- num_epochs: 40 |
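
For reference, these settings correspond roughly to the following `TrainingArguments` (a reconstruction only; the original training script is not available, and the output directory and any arguments not listed above are assumptions):

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# This is a reconstruction, not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-korean-d",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=40,
    evaluation_strategy="steps",  # assumption: evaluation every 50 steps, per the results table
    eval_steps=50,
    logging_steps=50,
)
```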
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | CER |
|
|:-------------:|:-----:|:----:|:---------------:|:------:| |
|
| 4.1174 | 0.66 | 50 | 5.1872 | 0.9986 | |
|
| 4.0452 | 1.32 | 100 | 5.1870 | 0.9986 | |
|
| 4.0499 | 1.97 | 150 | 5.2289 | 0.9986 | |
|
| 4.0371 | 2.63 | 200 | 5.1608 | 0.9986 | |
|
| 3.9664 | 3.29 | 250 | 5.1345 | 0.9977 | |
|
| 3.991 | 3.95 | 300 | 5.1517 | 0.9968 | |
|
| 3.9413 | 4.61 | 350 | 5.0673 | 0.9927 | |
|
| 3.9433 | 5.26 | 400 | 5.0650 | 0.9823 | |
|
| 3.8934 | 5.92 | 450 | 5.0518 | 0.9800 | |
|
| 3.8646 | 6.58 | 500 | 5.0400 | 0.9823 | |
|
| 3.8491 | 7.24 | 550 | 5.1012 | 0.9764 | |
|
| 3.8725 | 7.89 | 600 | 5.0649 | 0.9855 | |
|
| 3.7272 | 8.55 | 650 | 5.1139 | 0.9791 | |
|
| 3.8121 | 9.21 | 700 | 5.0366 | 0.9409 | |
|
| 3.7743 | 9.87 | 750 | 5.0990 | 0.9673 | |
|
| 3.7207 | 10.53 | 800 | 5.0603 | 0.9278 | |
|
| 3.7116 | 11.18 | 850 | 5.0920 | 0.9119 | |
|
| 3.7163 | 11.84 | 900 | 5.0840 | 0.8996 | |
|
| 3.657 | 12.5 | 950 | 5.0855 | 0.8928 | |
|
| 3.6476 | 13.16 | 1000 | 5.0409 | 0.8851 | |
|
| 3.645 | 13.82 | 1050 | 5.0704 | 0.9028 | |
|
| 3.5882 | 14.47 | 1100 | 5.0391 | 0.8610 | |
|
| 3.5773 | 15.13 | 1150 | 5.0805 | 0.8628 | |
|
| 3.5681 | 15.79 | 1200 | 5.1300 | 0.8769 | |
|
| 3.5611 | 16.45 | 1250 | 5.0740 | 0.8760 | |
|
| 3.5221 | 17.11 | 1300 | 5.0698 | 0.8669 | |
|
| 3.493 | 17.76 | 1350 | 5.0618 | 0.8455 | |
|
| 3.5117 | 18.42 | 1400 | 5.0372 | 0.8433 | |
|
| 3.4777 | 19.08 | 1450 | 5.0964 | 0.8642 | |
|
| 3.4632 | 19.74 | 1500 | 5.0928 | 0.8623 | |
|
| 3.4496 | 20.39 | 1550 | 5.1118 | 0.8710 | |
|
| 3.4674 | 21.05 | 1600 | 5.0703 | 0.8392 | |
|
| 3.431 | 21.71 | 1650 | 5.0514 | 0.8373 | |
|
| 3.4115 | 22.37 | 1700 | 5.0611 | 0.8355 | |
|
| 3.3808 | 23.03 | 1750 | 5.1055 | 0.8537 | |
|
| 3.4101 | 23.68 | 1800 | 5.0532 | 0.8296 | |
|
| 3.3852 | 24.34 | 1850 | 5.0646 | 0.8310 | |
|
| 3.3533 | 25.0 | 1900 | 5.0684 | 0.8387 | |
|
| 3.3591 | 25.66 | 1950 | 5.0581 | 0.8364 | |
|
| 3.3437 | 26.32 | 2000 | 5.0565 | 0.8314 | |
|
| 3.369 | 26.97 | 2050 | 5.0577 | 0.8364 | |
|
| 3.3606 | 27.63 | 2100 | 5.0515 | 0.8237 | |
|
| 3.3163 | 28.29 | 2150 | 5.0533 | 0.8278 | |
|
| 3.3149 | 28.95 | 2200 | 5.0682 | 0.8292 | |
|
| 3.3535 | 29.61 | 2250 | 5.0554 | 0.8274 | |
|
| 3.2695 | 30.26 | 2300 | 5.0610 | 0.8242 | |
|
| 3.2947 | 30.92 | 2350 | 5.0658 | 0.8255 | |
|
| 3.3323 | 31.58 | 2400 | 5.0644 | 0.8255 | |
|
| 3.2913 | 32.24 | 2450 | 5.0644 | 0.8255 | |
|
| 3.3169 | 32.89 | 2500 | 5.0644 | 0.8255 | |
|
| 3.3147 | 33.55 | 2550 | 5.0644 | 0.8255 | |
|
| 3.3059 | 34.21 | 2600 | 5.0644 | 0.8255 | |
|
| 3.3311 | 34.87 | 2650 | 5.0644 | 0.8255 | |
|
| 3.286 | 35.53 | 2700 | 5.0644 | 0.8255 | |
|
| 3.3842 | 36.18 | 2750 | 5.0644 | 0.8255 | |
|
| 3.303 | 36.84 | 2800 | 5.0644 | 0.8255 | |
|
| 3.2833 | 37.5 | 2850 | 5.0644 | 0.8255 | |
|
| 3.3036 | 38.16 | 2900 | 5.0644 | 0.8255 | |
|
| 3.3149 | 38.82 | 2950 | 5.0644 | 0.8255 | |
|
| 3.2784 | 39.47 | 3000 | 5.0644 | 0.8255 | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.30.2 |
|
- Pytorch 2.0.1+cu118 |
|
- Datasets 2.13.1 |
|
- Tokenizers 0.13.3 |
|
|