# whisper-largev3-egy-v1
This model is a fine-tuned version of [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 0.5546
- WER: 82.4230
- CER: 64.5239
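The card does not yet include a usage example, so below is a minimal transcription sketch using the `transformers` ASR pipeline. The checkpoint id `samil24/whisper-largev3-egy-v1` comes from this repository; `audio.wav` is a placeholder path you supply.

```python
# Minimal sketch: transcribe one audio file with this checkpoint.
# "audio.wav" is a placeholder; the pipeline resamples file inputs to 16 kHz.
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"

asr = pipeline(
    "automatic-speech-recognition",
    model="samil24/whisper-largev3-egy-v1",
    torch_dtype=torch.float16 if device != "cpu" else torch.float32,
    device=device,
)

result = asr("audio.wav")
print(result["text"])
```

For long-form audio, passing `chunk_length_s=30` to the pipeline call transcribes in 30-second windows.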
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training; a `Seq2SeqTrainingArguments` sketch restating them follows the list:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 2000
- num_epochs: 15
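These values map directly onto the Hugging Face `Seq2SeqTrainingArguments` used by the Trainer that generates cards like this one. A minimal reconstruction sketch follows; the output directory and the evaluation cadence are assumptions (the card only implies evaluation every 1500 steps via the results table below):

```python
# Sketch of training arguments matching the hyperparameters listed above.
# output_dir and the eval settings are assumptions, not recorded in the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-largev3-egy-v1",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=2000,
    num_train_epochs=15,
    eval_strategy="steps",  # assumed: the results table reports eval every 1500 steps
    eval_steps=1500,        # assumed
    predict_with_generate=True,  # needed so WER/CER are computed on generated text
)
```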
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|---|---|---|---|---|---|
| 0.2956 | 0.3742 | 1500 | 0.3018 | 63.6348 | 47.5770 |
| 0.2867 | 0.7483 | 3000 | 0.2841 | 93.2416 | 79.6830 |
| 0.213 | 1.1225 | 4500 | 0.2759 | 97.8988 | 85.8519 |
| 0.227 | 1.4966 | 6000 | 0.2677 | 91.6266 | 74.8936 |
| 0.2212 | 1.8708 | 7500 | 0.2617 | 93.4321 | 79.6175 |
| 0.1611 | 2.2449 | 9000 | 0.2667 | 96.8099 | 78.9492 |
| 0.1589 | 2.6191 | 10500 | 0.2643 | 100.5175 | 76.5414 |
| 0.1697 | 2.9933 | 12000 | 0.2622 | 84.5441 | 67.4836 |
| 0.1071 | 3.3674 | 13500 | 0.2855 | 96.4829 | 81.3859 |
| 0.1136 | 3.7416 | 15000 | 0.2814 | 99.5195 | 75.9088 |
| 0.0637 | 4.1157 | 16500 | 0.3122 | 83.2163 | 64.5333 |
| 0.0692 | 4.4899 | 18000 | 0.3137 | 90.7196 | 76.6609 |
| 0.0687 | 4.8641 | 19500 | 0.3179 | 91.4560 | 75.3464 |
| 0.0333 | 5.2382 | 21000 | 0.3506 | 90.0088 | 71.7452 |
| 0.0366 | 5.6124 | 22500 | 0.3571 | 86.4889 | 69.5759 |
| 0.0379 | 5.9865 | 24000 | 0.3554 | 88.3228 | 63.6308 |
| 0.0174 | 6.3607 | 25500 | 0.3917 | 91.0494 | 75.0314 |
| 0.0176 | 6.7348 | 27000 | 0.3937 | 84.2740 | 64.2078 |
| 0.0075 | 7.1090 | 28500 | 0.4261 | 83.2419 | 65.5333 |
| 0.0092 | 7.4832 | 30000 | 0.4228 | 89.9832 | 69.3133 |
| 0.0096 | 7.8573 | 31500 | 0.4328 | 86.2415 | 71.8841 |
| 0.0042 | 8.2315 | 33000 | 0.4482 | 83.1168 | 64.2068 |
| 0.0045 | 8.6056 | 34500 | 0.4502 | 86.0169 | 67.4873 |
| 0.005 | 8.9798 | 36000 | 0.4580 | 83.4096 | 65.3331 |
| 0.0027 | 9.3540 | 37500 | 0.4667 | 82.2837 | 60.9284 |
| 0.0027 | 9.7281 | 39000 | 0.4718 | 84.5469 | 63.7325 |
| 0.0012 | 10.1023 | 40500 | 0.4819 | 80.8678 | 62.0516 |
| 0.0012 | 10.4764 | 42000 | 0.4911 | 81.6951 | 59.3660 |
| 0.0013 | 10.8506 | 43500 | 0.4950 | 78.6415 | 60.5107 |
| 0.0005 | 11.2247 | 45000 | 0.5076 | 81.8117 | 62.4285 |
| 0.0005 | 11.5989 | 46500 | 0.5055 | 78.8746 | 59.2439 |
| 0.0008 | 11.9731 | 48000 | 0.5101 | 79.2244 | 59.3010 |
| 0.0006 | 12.3472 | 49500 | 0.5234 | 81.2459 | 63.5160 |
| 0.0002 | 12.7214 | 51000 | 0.5310 | 83.3272 | 65.2183 |
| 0.0001 | 13.0955 | 52500 | 0.5380 | 83.5347 | 65.1864 |
| 0.0001 | 13.4697 | 54000 | 0.5450 | 81.4990 | 63.0972 |
| 0.0001 | 13.8439 | 55500 | 0.5491 | 82.4230 | 64.9317 |
| 0.0001 | 14.2180 | 57000 | 0.5523 | 82.3889 | 64.7414 |
| 0.0001 | 14.5922 | 58500 | 0.5544 | 82.5510 | 64.7718 |
| 0.0001 | 14.9663 | 60000 | 0.5546 | 82.4230 | 64.5239 |
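The WER and CER columns are percentages of word- and character-level edit distance. A sketch of how such numbers are typically computed with the `evaluate` library (which uses `jiwer` under the hood; the strings below are illustrative placeholders):

```python
# Sketch: computing WER/CER with the `evaluate` library (requires `jiwer`).
# The reference and prediction strings are illustrative placeholders.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["a reference transcript"]
predictions = ["a predicted transcript"]

# compute() returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
cer = 100 * cer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```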
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.1+cu128
- Datasets 3.6.0
- Tokenizers 0.21.4
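To reproduce results against the same stack, a quick version check may help (the exact CUDA build of PyTorch will differ across machines):

```python
# Sketch: compare installed versions against those listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.51.3",
    "torch": "2.7.1+cu128",
    "datasets": "3.6.0",
    "tokenizers": "0.21.4",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card lists {want}")
```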