whisper-large-v3-turbo-icelandic-iceland-0.11-cer-filtered
This model is a fine-tuned version of openai/whisper-large-v3-turbo on the Icelandic subset of the FLEURS dataset. It achieves the following results on the evaluation set:
- Loss: 0.4170
- Model Preparation Time: 0.0065
- WER Ortho: 25.8789
- WER: 13.6496
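The two WER figures presumably follow the common Whisper fine-tuning recipe: the orthographic WER is computed on raw references and predictions, while the plain WER applies Whisper's BasicTextNormalizer first. A minimal sketch of that computation, assuming that recipe (the reference/prediction strings are hypothetical):

```python
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")  # requires the `evaluate` and `jiwer` packages
normalizer = BasicTextNormalizer()  # language-agnostic normalizer used for non-English text

# Hypothetical pair, just to illustrate the two metrics.
references = ["Halló heimur!"]
predictions = ["halló heimur"]

# Orthographic WER: raw strings, so casing and punctuation count as errors.
wer_ortho = 100 * wer_metric.compute(references=references, predictions=predictions)

# Normalized WER: casing and punctuation are stripped before scoring.
wer = 100 * wer_metric.compute(
    references=[normalizer(r) for r in references],
    predictions=[normalizer(p) for p in predictions],
)
print(f"WER Ortho: {wer_ortho:.2f}, WER: {wer:.2f}")
```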
Model description
More information needed
Intended uses & limitations
More information needed
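While no usage notes were provided, a minimal sketch of loading this checkpoint for Icelandic transcription with the standard transformers ASR pipeline follows; the audio filename is a placeholder:

```python
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="SamuelPfisterer1/whisper-large-v3-turbo-icelandic-iceland-0.11-cer-filtered",
    torch_dtype=torch.float16,
    device="cuda:0",  # or "cpu"
)

# "audio.wav" is a hypothetical path; the pipeline resamples audio as needed.
result = asr(
    "audio.wav",
    generate_kwargs={"language": "icelandic", "task": "transcribe"},
)
print(result["text"])
```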
Training and evaluation data
More information needed
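The exact splits are not documented. Assuming the Icelandic (Iceland) configuration of google/fleurs implied by the model name, the data can be loaded like this:

```python
from datasets import load_dataset, Audio

# "is_is" is the FLEURS locale code for Icelandic (Iceland); this mirrors the
# dataset named on this card, not a documented training setup.
fleurs = load_dataset("google/fleurs", "is_is")

# Whisper feature extractors expect 16 kHz audio.
fleurs = fleurs.cast_column("audio", Audio(sampling_rate=16_000))

print(fleurs)  # DatasetDict with train/validation/test splits
print(fleurs["train"][0]["transcription"])
```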
Training procedure
Training hyperparameters
The following hyperparameters were used during training (mirrored in the configuration sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 3
- mixed_precision_training: Native AMP
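For reproducibility, these settings map onto transformers' Seq2SeqTrainingArguments roughly as follows; output_dir is hypothetical, and the card does not document evaluation or save cadence:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch reconstructing the listed hyperparameters; names marked as
# placeholders are assumptions, not taken from this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v3-turbo-icelandic",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,  # effective train batch size: 128
    seed=42,
    optim="adamw_torch",            # AdamW; betas=(0.9, 0.999), eps=1e-8 are defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=3,
    fp16=True,                      # "Native AMP" mixed precision
)
```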
Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | WER Ortho | WER |
|---|---|---|---|---|---|---|
| 0.2952 | 0.1017 | 32 | 0.5322 | 0.0065 | 29.1992 | 18.2962 |
| 0.2454 | 0.2035 | 64 | 0.5165 | 0.0065 | 30.2734 | 19.7483 |
| 0.2157 | 0.3052 | 96 | 0.5138 | 0.0065 | 29.5898 | 18.4898 |
| 0.2006 | 0.4070 | 128 | 0.5135 | 0.0065 | 29.9805 | 18.2962 |
| 0.1753 | 0.5087 | 160 | 0.5199 | 0.0065 | 29.4922 | 18.0058 |
| 0.1728 | 0.6105 | 192 | 0.4628 | 0.0065 | 29.7852 | 18.1026 |
| 0.1718 | 0.7122 | 224 | 0.4562 | 0.0065 | 28.7109 | 16.6505 |
| 0.1631 | 0.8140 | 256 | 0.4364 | 0.0065 | 29.5898 | 18.2962 |
| 0.1614 | 0.9157 | 288 | 0.4461 | 0.0065 | 28.8086 | 16.9409 |
| 0.1295 | 1.0159 | 320 | 0.4348 | 0.0065 | 27.5391 | 15.3921 |
| 0.1131 | 1.1176 | 352 | 0.4490 | 0.0065 | 28.1250 | 15.9729 |
| 0.1104 | 1.2194 | 384 | 0.4377 | 0.0065 | 27.6367 | 16.0697 |
| 0.1192 | 1.3211 | 416 | 0.4263 | 0.0065 | 27.2461 | 15.1985 |
| 0.1139 | 1.4229 | 448 | 0.4183 | 0.0065 | 27.5391 | 15.6825 |
| 0.1085 | 1.5246 | 480 | 0.4321 | 0.0065 | 28.2227 | 16.1665 |
| 0.1122 | 1.6264 | 512 | 0.4482 | 0.0065 | 27.0508 | 15.0048 |
| 0.1092 | 1.7281 | 544 | 0.4115 | 0.0065 | 26.9531 | 14.9080 |
| 0.1091 | 1.8299 | 576 | 0.4246 | 0.0065 | 26.6602 | 14.8112 |
| 0.1065 | 1.9316 | 608 | 0.4372 | 0.0065 | 27.7344 | 16.2633 |
| 0.0785 | 2.0318 | 640 | 0.4554 | 0.0065 | 26.9531 | 14.8112 |
| 0.0768 | 2.1335 | 672 | 0.4434 | 0.0065 | 26.3672 | 14.3272 |
| 0.0749 | 2.2353 | 704 | 0.4343 | 0.0065 | 27.8320 | 15.7793 |
| 0.0777 | 2.3370 | 736 | 0.4170 | 0.0065 | 25.8789 | 13.6496 |
| 0.0762 | 2.4388 | 768 | 0.4372 | 0.0065 | 26.8555 | 14.3272 |
| 0.0756 | 2.5405 | 800 | 0.4403 | 0.0065 | 26.4648 | 14.4240 |
| 0.0785 | 2.6423 | 832 | 0.4276 | 0.0065 | 27.0508 | 14.6176 |
| 0.0734 | 2.7440 | 864 | 0.4328 | 0.0065 | 25.9766 | 13.6496 |
| 0.0772 | 2.8458 | 896 | 0.4301 | 0.0065 | 26.2695 | 13.8432 |
| 0.0724 | 2.9475 | 928 | 0.4364 | 0.0065 | 25.9766 | 13.6496 |
Framework versions
- Transformers 4.51.3
- Pytorch 2.5.1
- Datasets 3.6.0
- Tokenizers 0.21.1