# whisper-tiny-cv-full-synthetic-pt
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
Evaluation results over the course of training are reported in the [Training results](#training-results) table below.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
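With `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`, the learning rate ramps up linearly over the first 10% of optimizer steps and then decays linearly to zero. A minimal plain-Python sketch of that schedule (assuming roughly 1700 total steps, the last step in the results table below; `transformers` derives the warmup step count from the ratio in the same way):

```python
def linear_lr(step, total_steps=1700, base_lr=5e-5, warmup_ratio=0.1):
    """Linear warmup followed by linear decay to zero (the "linear" scheduler)."""
    warmup_steps = int(total_steps * warmup_ratio)  # 170 steps with these values
    if step < warmup_steps:
        # Warmup phase: LR grows linearly from 0 to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Decay phase: LR falls linearly from base_lr to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, the peak rate of 5e-05 is reached at step 170 and halves by roughly step 935.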
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.6982        | 0.2907 | 50   | 0.6187          |
| 0.5012        | 0.5814 | 100  | 0.5461          |
| 0.448         | 0.8721 | 150  | 0.5133          |
| 0.3321        | 1.1628 | 200  | 0.4977          |
| 0.3198        | 1.4535 | 250  | 0.4814          |
| 0.3069        | 1.7442 | 300  | 0.4700          |
| 0.2712        | 2.0349 | 350  | 0.4582          |
| 0.1954        | 2.3256 | 400  | 0.4599          |
| 0.2032        | 2.6163 | 450  | 0.4585          |
| 0.2011        | 2.9070 | 500  | 0.4517          |
| 0.1334        | 3.1977 | 550  | 0.4576          |
| 0.1309        | 3.4884 | 600  | 0.4596          |
| 0.13          | 3.7791 | 650  | 0.4613          |
| 0.1146        | 4.0698 | 700  | 0.4667          |
| 0.0906        | 4.3605 | 750  | 0.4702          |
| 0.0918        | 4.6512 | 800  | 0.4754          |
| 0.0955        | 4.9419 | 850  | 0.4741          |
| 0.0599        | 5.2326 | 900  | 0.4841          |
| 0.0647        | 5.5233 | 950  | 0.4877          |
| 0.0611        | 5.8140 | 1000 | 0.4953          |
| 0.0505        | 6.1047 | 1050 | 0.5025          |
| 0.042         | 6.3953 | 1100 | 0.5023          |
| 0.0448        | 6.6860 | 1150 | 0.5082          |
| 0.0452        | 6.9767 | 1200 | 0.5040          |
| 0.0294        | 7.2674 | 1250 | 0.5132          |
| 0.032         | 7.5581 | 1300 | 0.5164          |
| 0.0309        | 7.8488 | 1350 | 0.5183          |
| 0.0239        | 8.1395 | 1400 | 0.5241          |
| 0.0232        | 8.4302 | 1450 | 0.5250          |
| 0.0244        | 8.7209 | 1500 | 0.5256          |
| 0.024         | 9.0116 | 1550 | 0.5278          |
| 0.0207        | 9.3023 | 1600 | 0.5291          |
| 0.0217        | 9.5930 | 1650 | 0.5303          |
| 0.0203        | 9.8837 | 1700 | 0.5305          |
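Validation loss bottoms out at step 500 (around epoch 2.9) and climbs steadily afterwards while training loss keeps falling, a typical overfitting pattern, so the step-500 checkpoint looks like the strongest candidate. A quick sketch that picks the best checkpoint from the (step, validation loss) pairs in the table above:

```python
# (step, validation_loss) pairs copied from the training results table.
eval_history = [
    (50, 0.6187), (100, 0.5461), (150, 0.5133), (200, 0.4977),
    (250, 0.4814), (300, 0.4700), (350, 0.4582), (400, 0.4599),
    (450, 0.4585), (500, 0.4517), (550, 0.4576), (600, 0.4596),
    (650, 0.4613), (700, 0.4667), (750, 0.4702), (800, 0.4754),
    (850, 0.4741), (900, 0.4841), (950, 0.4877), (1000, 0.4953),
    (1050, 0.5025), (1100, 0.5023), (1150, 0.5082), (1200, 0.5040),
    (1250, 0.5132), (1300, 0.5164), (1350, 0.5183), (1400, 0.5241),
    (1450, 0.5250), (1500, 0.5256), (1550, 0.5278), (1600, 0.5291),
    (1650, 0.5303), (1700, 0.5305),
]

# Pick the checkpoint with the lowest validation loss.
best_step, best_loss = min(eval_history, key=lambda pair: pair[1])
print(best_step, best_loss)  # → 500 0.4517
```

In a `transformers` training run this selection is usually automated with `load_best_model_at_end=True` and `metric_for_best_model="eval_loss"`; whether that was done here is not stated in the card.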
### Framework versions
- Transformers 4.50.2
- Pytorch 2.5.1+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2
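To reproduce this environment, pinning the versions above should be enough. A sketch (the `+cu124` PyTorch build is normally installed from PyTorch's own cu124 wheel index rather than PyPI):

```shell
# Pin the framework versions listed above.
pip install "transformers==4.50.2" "datasets==3.6.0" "tokenizers==0.21.2"
# torch 2.5.1+cu124 comes from the PyTorch CUDA 12.4 wheel index.
pip install "torch==2.5.1" --index-url https://download.pytorch.org/whl/cu124
```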