Moroccan-Darija-STT-tiny-v1.6.2

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5041
  • Wer: 265.2778
  • Cer: 234.7702
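Since this is a fine-tuned openai/whisper-tiny checkpoint, it can presumably be loaded with the standard Transformers automatic-speech-recognition pipeline. A minimal sketch follows; the audio file path is a placeholder, not part of the original card.

```python
# Minimal sketch: load the checkpoint with the standard transformers ASR pipeline.
# "audio.wav" is a placeholder path; any mono audio file should work
# (the pipeline resamples to 16 kHz as Whisper expects).
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="BounharAbdelaziz/Moroccan-Darija-STT-tiny-v1.6.2",
    device=0 if torch.cuda.is_available() else -1,
)

result = asr("audio.wav")
print(result["text"])
```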

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3.75e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 7
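
As a rough sketch, these hyperparameters map onto `Seq2SeqTrainingArguments` as shown below. Only the values listed above come from the card; the output directory and the evaluation cadence are illustrative assumptions.

```python
# Rough reconstruction of the training configuration as Seq2SeqTrainingArguments.
# Values not listed in the card (output_dir, eval cadence, predict_with_generate)
# are assumptions for illustration only.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-darija",  # assumption, not from the card
    learning_rate=3.75e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=7,
    eval_strategy="steps",     # assumption: the results table reports eval every 50 steps
    eval_steps=50,             # assumption
    predict_with_generate=True,  # typical for Whisper fine-tuning (assumption)
)
```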

Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer      | Cer      |
|:-------------:|:------:|:----:|:---------------:|:--------:|:--------:|
| 1.3599        | 0.3067 | 50   | 0.5879          | 342.3025 | 321.1932 |
| 1.0078        | 0.6135 | 100  | 0.5332          | 429.5515 | 391.1339 |
| 0.9529        | 0.9202 | 150  | 0.4855          | 384.4712 | 317.1326 |
| 0.8157        | 1.2270 | 200  | 0.4915          | 299.5984 | 252.5396 |
| 0.7735        | 1.5337 | 250  | 0.4780          | 385.3999 | 338.2396 |
| 0.8126        | 1.8405 | 300  | 0.4780          | 504.2169 | 452.5717 |
| 0.6253        | 2.1472 | 350  | 0.4674          | 237.0231 | 189.8620 |
| 0.6763        | 2.4540 | 400  | 0.4699          | 354.5683 | 302.8715 |
| 0.6164        | 2.7607 | 450  | 0.4630          | 219.2687 | 174.6702 |
| 0.5001        | 3.0675 | 500  | 0.4820          | 240.1523 | 194.6878 |
| 0.5454        | 3.3742 | 550  | 0.4760          | 206.9110 | 164.0120 |
| 0.4911        | 3.6810 | 600  | 0.4800          | 277.3009 | 233.3919 |
| 0.4934        | 3.9877 | 650  | 0.4891          | 237.7761 | 190.4988 |
| 0.5278        | 4.2945 | 700  | 0.4890          | 213.9224 | 178.0535 |
| 0.5464        | 4.6012 | 750  | 0.4933          | 251.3136 | 217.1512 |
| 0.4388        | 4.9080 | 800  | 0.4905          | 266.0894 | 224.1339 |
| 0.4061        | 5.2147 | 850  | 0.4947          | 227.1252 | 191.6372 |
| 0.4557        | 5.5215 | 900  | 0.4980          | 213.8638 | 173.6314 |
| 0.4852        | 5.8282 | 950  | 0.4970          | 253.1376 | 217.9467 |
| 0.393         | 6.1350 | 1000 | 0.5037          | 235.5589 | 198.1504 |
| 0.3682        | 6.4417 | 1050 | 0.5042          | 234.1449 | 199.3294 |
| 0.3934        | 6.7485 | 1100 | 0.5041          | 265.2778 | 234.7702 |
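
For reference, WER and CER values like those above are typically computed with the `evaluate` library, as sketched below; both metrics can exceed 100% when the hypothesis contains many insertions relative to the reference. The prediction and reference lists in the sketch are placeholders.

```python
# Sketch of how WER/CER are typically computed for Whisper fine-tuning runs.
# Both metrics can exceed 100% when insertions outnumber reference tokens.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["..."]  # decoded model transcriptions (placeholder)
references = ["..."]   # ground-truth transcripts (placeholder)

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```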

Framework versions

  • Transformers 4.48.0.dev0
  • Pytorch 2.5.1+cu124
  • Datasets 3.1.0
  • Tokenizers 0.21.0