whisper-large-v3-turbo-slovenian-slovenia-3-percent

This model is a fine-tuned version of openai/whisper-large-v3-turbo on the fleurs dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5694
  • Model Preparation Time: 0.0065
  • Wer Ortho: 31.9463
  • Wer: 12.9700
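The large gap between Wer Ortho (31.95) and Wer (12.97) typically comes from text normalization before scoring: the orthographic metric compares raw transcripts, while the plain WER is computed after normalization. Below is a minimal word-error-rate sketch with a hypothetical lowercase/strip-punctuation normalizer; the exact normalizer used in this evaluation is not documented here.

```python
import re

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[-1][-1] / len(ref)

def normalize(text: str) -> str:
    # Hypothetical normalizer: lowercase and drop punctuation.
    return re.sub(r"[^\w\s]", "", text.lower())

ref = "Ljubljana je glavno mesto Slovenije."
hyp = "ljubljana je glavno mesto slovenije"
print(wer(ref, hyp))                        # orthographic: 0.4 (case + punctuation)
print(wer(normalize(ref), normalize(hyp)))  # normalized: 0.0
```

On this toy pair the orthographic score counts casing and punctuation mismatches as errors, which the normalized score ignores; the same effect explains why Wer is much lower than Wer Ortho above.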

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.06
  • num_epochs: 3
  • mixed_precision_training: Native AMP
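The effective batch size follows from the per-device batch size and gradient accumulation (64 × 2 = 128, matching total_train_batch_size), and the warmup ratio of 0.06 translates into a concrete step count. A small arithmetic sketch; the total of ~1600 optimizer steps is inferred from the results table below, so the trainer's exact count may differ slightly:

```python
# Effective batch size = per-device batch size x gradient accumulation steps.
train_batch_size = 64
gradient_accumulation_steps = 2
effective_batch = train_batch_size * gradient_accumulation_steps
print(effective_batch)  # 128, matches total_train_batch_size

# Warmup steps from warmup_ratio; total steps assumed from the results table.
total_steps = 1600
warmup_ratio = 0.06
warmup_steps = int(total_steps * warmup_ratio)
print(warmup_steps)  # 96
```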

Training results

| Training Loss | Epoch  | Step | Validation Loss | Model Preparation Time | Wer Ortho | Wer     |
|:-------------:|:------:|:----:|:---------------:|:----------------------:|:---------:|:-------:|
| 0.1125        | 0.0929 | 50   | 0.6541          | 0.0065                 | 36.1317   | 18.4317 |
| 0.0982        | 0.1857 | 100  | 0.5796          | 0.0065                 | 36.2471   | 18.4559 |
| 0.0897        | 0.2786 | 150  | 0.5838          | 0.0065                 | 35.0929   | 17.4447 |
| 0.0879        | 0.3714 | 200  | 0.5543          | 0.0065                 | 35.0322   | 17.3236 |
| 0.0815        | 0.4643 | 250  | 0.5700          | 0.0065                 | 34.1575   | 16.5849 |
| 0.0778        | 0.5571 | 300  | 0.5816          | 0.0065                 | 34.1028   | 16.3912 |
| 0.0765        | 0.6500 | 350  | 0.5661          | 0.0065                 | 34.4733   | 16.4335 |
| 0.0737        | 0.7428 | 400  | 0.5970          | 0.0065                 | 33.9266   | 15.4102 |
| 0.0742        | 0.8357 | 450  | 0.6086          | 0.0065                 | 33.6229   | 15.4163 |
| 0.0715        | 0.9285 | 500  | 0.5717          | 0.0065                 | 33.3678   | 14.8774 |
| 0.0525        | 1.0204 | 550  | 0.5980          | 0.0065                 | 33.1005   | 14.6715 |
| 0.0514        | 1.1133 | 600  | 0.5748          | 0.0065                 | 32.7481   | 14.3566 |
| 0.0489        | 1.2061 | 650  | 0.6079          | 0.0065                 | 33.2949   | 14.6352 |
| 0.0502        | 1.2990 | 700  | 0.6025          | 0.0065                 | 33.4224   | 14.7139 |
| 0.0487        | 1.3918 | 750  | 0.5779          | 0.0065                 | 32.5902   | 13.9570 |
| 0.0503        | 1.4847 | 800  | 0.5803          | 0.0065                 | 32.5598   | 13.9873 |
| 0.0504        | 1.5775 | 850  | 0.5921          | 0.0065                 | 32.2926   | 13.8965 |
| 0.0492        | 1.6704 | 900  | 0.5524          | 0.0065                 | 32.2622   | 13.6361 |
| 0.047         | 1.7632 | 950  | 0.5789          | 0.0065                 | 32.5598   | 13.8662 |
| 0.0486        | 1.8561 | 1000 | 0.5798          | 0.0065                 | 32.4930   | 13.9328 |
| 0.0491        | 1.9489 | 1050 | 0.5700          | 0.0065                 | 32.5841   | 13.8359 |
| 0.034         | 2.0409 | 1100 | 0.6116          | 0.0065                 | 32.4566   | 13.7390 |
| 0.0365        | 2.1337 | 1150 | 0.5827          | 0.0065                 | 32.4930   | 13.6119 |
| 0.0353        | 2.2266 | 1200 | 0.5735          | 0.0065                 | 32.2440   | 13.3394 |
| 0.0338        | 2.3194 | 1250 | 0.5869          | 0.0065                 | 32.2683   | 13.3878 |
| 0.0353        | 2.4123 | 1300 | 0.5642          | 0.0065                 | 32.1589   | 13.3636 |
| 0.0342        | 2.5051 | 1350 | 0.5871          | 0.0065                 | 32.0131   | 13.2062 |
| 0.0347        | 2.5980 | 1400 | 0.5866          | 0.0065                 | 32.1285   | 13.2365 |
| 0.0341        | 2.6908 | 1450 | 0.5689          | 0.0065                 | 32.0678   | 13.2425 |
| 0.0357        | 2.7837 | 1500 | 0.5731          | 0.0065                 | 31.9949   | 13.0911 |
| 0.0336        | 2.8765 | 1550 | 0.5742          | 0.0065                 | 32.1650   | 13.1759 |
| 0.0351        | 2.9694 | 1600 | 0.5694          | 0.0065                 | 31.9463   | 12.9700 |

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.5.1
  • Datasets 3.6.0
  • Tokenizers 0.21.1
Model size: 809M parameters (F32, safetensors)

Model tree for SamuelPfisterer1/whisper-large-v3-turbo-slovenian-slovenia-3-percent: fine-tuned from openai/whisper-large-v3-turbo.