whisper-large-v3-turbo-latvian-latvia-0.06-cer-filtered
This model is a fine-tuned version of openai/whisper-large-v3-turbo on the FLEURS dataset. It achieves the following results on the evaluation set:
- Loss: 0.4949
- Model Preparation Time: 0.0064
- Wer Ortho: 35.0098
- Wer: 11.0794
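For reference, a minimal transcription sketch using the transformers pipeline API; the repository id is assumed from the model tree at the bottom of this card, and the audio path is a placeholder.

```python
# Minimal sketch: transcribe Latvian audio with the fine-tuned checkpoint.
# The repository id is assumed from the model tree below; "sample.wav" is a placeholder.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="SamuelPfisterer1/whisper-large-v3-turbo-latvian-latvia-0.06-cer-filtered",
)

print(asr("sample.wav")["text"])
```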
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
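Based on the model name, the data is presumably the Latvian (lv_lv) configuration of google/fleurs; a minimal loading sketch under that assumption:

```python
# Hedged sketch: load the Latvian FLEURS configuration presumed from the model name.
from datasets import load_dataset

fleurs_lv = load_dataset("google/fleurs", "lv_lv")
print(fleurs_lv)                               # train / validation / test splits
print(fleurs_lv["train"][0]["transcription"])  # reference transcription text
```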
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a code sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 3
- mixed_precision_training: Native AMP
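A hedged sketch expressing the hyperparameters above as transformers Seq2SeqTrainingArguments; the output directory is a placeholder, and the betas/epsilon values listed above are the adamw_torch defaults rather than extra arguments.

```python
# Hedged sketch: the hyperparameters above as Seq2SeqTrainingArguments
# (transformers 4.51); output_dir is a placeholder.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v3-turbo-latvian-latvia",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size: 64 * 2 = 128
    optim="adamw_torch",            # betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=3,
    fp16=True,                      # native AMP mixed precision
)
```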
Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Wer Ortho | Wer |
|---|---|---|---|---|---|---|
| 0.2464 | 0.0837 | 32 | 0.4843 | 0.0064 | 38.5584 | 17.2289 |
| 0.2141 | 0.1673 | 64 | 0.4760 | 0.0064 | 38.4149 | 16.7942 |
| 0.2035 | 0.2510 | 96 | 0.4884 | 0.0064 | 38.0561 | 16.1650 |
| 0.1864 | 0.3346 | 128 | 0.4361 | 0.0064 | 37.3190 | 15.5423 |
| 0.1742 | 0.4183 | 160 | 0.4530 | 0.0064 | 37.4364 | 15.2763 |
| 0.1534 | 0.5020 | 192 | 0.4413 | 0.0064 | 37.2342 | 14.4266 |
| 0.1608 | 0.5856 | 224 | 0.4777 | 0.0064 | 36.2427 | 14.0893 |
| 0.1655 | 0.6693 | 256 | 0.4263 | 0.0064 | 36.1057 | 13.6611 |
| 0.1555 | 0.7529 | 288 | 0.4399 | 0.0064 | 35.9817 | 13.3433 |
| 0.1507 | 0.8366 | 320 | 0.4227 | 0.0064 | 35.6882 | 13.0773 |
| 0.1495 | 0.9203 | 352 | 0.4559 | 0.0064 | 35.9295 | 13.0384 |
| 0.138 | 1.0026 | 384 | 0.4431 | 0.0064 | 35.6295 | 12.7530 |
| 0.1165 | 1.0863 | 416 | 0.4746 | 0.0064 | 35.9361 | 12.3832 |
| 0.1211 | 1.1699 | 448 | 0.4599 | 0.0064 | 35.6295 | 12.1757 |
| 0.1177 | 1.2536 | 480 | 0.4726 | 0.0064 | 35.8317 | 12.3703 |
| 0.1208 | 1.3373 | 512 | 0.4590 | 0.0064 | 35.7991 | 12.3508 |
| 0.118 | 1.4209 | 544 | 0.4564 | 0.0064 | 35.7078 | 12.2146 |
| 0.1179 | 1.5046 | 576 | 0.4346 | 0.0064 | 35.4338 | 11.8773 |
| 0.1206 | 1.5882 | 608 | 0.4512 | 0.0064 | 35.5447 | 11.7475 |
| 0.1167 | 1.6719 | 640 | 0.4653 | 0.0064 | 35.3164 | 11.6697 |
| 0.1065 | 1.7556 | 672 | 0.4559 | 0.0064 | 35.2120 | 11.5789 |
| 0.1026 | 1.8392 | 704 | 0.4734 | 0.0064 | 34.9772 | 11.5335 |
| 0.1128 | 1.9229 | 736 | 0.4723 | 0.0064 | 35.0294 | 11.5400 |
| 0.1142 | 2.0052 | 768 | 0.4784 | 0.0064 | 34.9772 | 11.5335 |
| 0.0941 | 2.0889 | 800 | 0.4937 | 0.0064 | 35.2577 | 11.4946 |
| 0.0874 | 2.1725 | 832 | 0.4896 | 0.0064 | 35.0424 | 11.4232 |
| 0.0842 | 2.2562 | 864 | 0.4933 | 0.0064 | 35.1076 | 11.3324 |
| 0.082 | 2.3399 | 896 | 0.4811 | 0.0064 | 34.6510 | 11.2156 |
| 0.0864 | 2.4235 | 928 | 0.4802 | 0.0064 | 34.5597 | 11.0924 |
| 0.0755 | 2.5072 | 960 | 0.4954 | 0.0064 | 34.8206 | 11.3389 |
| 0.0833 | 2.5908 | 992 | 0.4900 | 0.0064 | 34.9967 | 11.2026 |
| 0.0956 | 2.6745 | 1024 | 0.4928 | 0.0064 | 34.9511 | 11.1572 |
| 0.0801 | 2.7582 | 1056 | 0.4872 | 0.0064 | 34.7032 | 11.1313 |
| 0.0782 | 2.8418 | 1088 | 0.4949 | 0.0064 | 35.0098 | 11.0794 |
| 0.0851 | 2.9255 | 1120 | 0.4979 | 0.0064 | 34.9119 | 11.1508 |
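The card reports two error rates: Wer Ortho on raw text and Wer on normalized text. The exact normalization is not stated; below is a hedged sketch of the usual computation with the evaluate library, assuming Whisper's BasicTextNormalizer and placeholder prediction/reference strings.

```python
# Hedged sketch: computing "Wer Ortho" (raw text) and "Wer" (normalized text)
# with the evaluate library, as is common in Whisper fine-tuning recipes.
# The card does not state the exact normalizer; BasicTextNormalizer is an assumption.
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()

predictions = ["Sveiki, kā jums klājas?"]  # placeholder model outputs
references = ["Sveiki, kā Jums klājas?"]   # placeholder ground truth

wer_ortho = 100 * wer_metric.compute(predictions=predictions, references=references)
wer = 100 * wer_metric.compute(
    predictions=[normalizer(p) for p in predictions],
    references=[normalizer(r) for r in references],
)
print(f"WER (orthographic): {wer_ortho:.2f}  WER (normalized): {wer:.2f}")
```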
Framework versions
- Transformers 4.51.3
- PyTorch 2.5.1
- Datasets 3.6.0
- Tokenizers 0.21.1
Model tree for SamuelPfisterer1/whisper-large-v3-turbo-latvian-latvia-0.06-cer-filtered
- Base model: openai/whisper-large-v3
- Fine-tuned from: openai/whisper-large-v3-turbo