IoanaLivia/real_data_1_h_synth_standard_B_horoscope-whisper-small

This model is a fine-tuned version of openai/whisper-small on the IoanaLivia/real_data_1_h_synth_standard_B_horoscope dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2705
  • WER: 18.4459

Model description

More information needed

Intended uses & limitations

More information needed
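Until the card is filled in, a minimal usage sketch (not part of the original card): load the checkpoint with the `transformers` speech-recognition pipeline and transcribe one clip. The 16 kHz sine wave below is a stand-in for a real recording; substitute your own audio.

```python
import numpy as np
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
asr = pipeline(
    "automatic-speech-recognition",
    model="IoanaLivia/real_data_1_h_synth_standard_B_horoscope-whisper-small",
)

# Whisper expects 16 kHz mono audio; this 1-second tone is a placeholder input.
sample_rate = 16_000
audio = np.sin(2 * np.pi * 440 * np.arange(sample_rate) / sample_rate).astype(np.float32)

result = asr({"raw": audio, "sampling_rate": sample_rate})
print(result["text"])
```

For real use, pass a path to an audio file (or a decoded waveform) instead of the synthetic tone.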

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
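The hyperparameters above can be sanity-checked with a short sketch: the effective batch size follows from gradient accumulation, and `lr_scheduler_type: linear` implies a linear decay of the learning rate over training (assuming no warmup, which the card does not specify). The step count of 615 is taken from the results table below.

```python
# Values copied from the hyperparameter list above.
learning_rate = 1e-5
train_batch_size = 8
gradient_accumulation_steps = 2
total_steps = 615  # final optimizer step in the results table

# 8 samples per device step x 2 accumulation steps = 16 samples per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 16

def lr_at(step: int) -> float:
    """Linear decay from the peak learning rate to 0 over total_steps (no warmup assumed)."""
    return learning_rate * max(0.0, 1.0 - step / total_steps)
```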

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER     |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| No log        | 0       | 0    | 0.4170          | 30.1014 |
| 0.3332        | 1.0     | 42   | 0.2841          | 19.8556 |
| 0.1356        | 2.0     | 84   | 0.2705          | 18.4459 |
| 0.0678        | 3.0     | 126  | 0.2814          | 18.9789 |
| 0.0331        | 4.0     | 168  | 0.3037          | 20.0791 |
| 0.0162        | 5.0     | 210  | 0.3249          | 19.4774 |
| 0.0098        | 6.0     | 252  | 0.3424          | 19.2711 |
| 0.0065        | 7.0     | 294  | 0.3458          | 18.8413 |
| 0.0050        | 8.0     | 336  | 0.3524          | 19.0992 |
| 0.0039        | 9.0     | 378  | 0.3605          | 19.0992 |
| 0.0034        | 10.0    | 420  | 0.3654          | 18.8929 |
| 0.0030        | 11.0    | 462  | 0.3680          | 18.9445 |
| 0.0028        | 12.0    | 504  | 0.3708          | 18.8929 |
| 0.0026        | 13.0    | 546  | 0.3729          | 19.0132 |
| 0.0025        | 14.0    | 588  | 0.3737          | 19.0304 |
| 0.0025        | 14.6506 | 615  | 0.3738          | 19.0132 |
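The WER column above is the word error rate expressed as a percentage: the word-level edit distance between hypothesis and reference, divided by the reference length. A minimal self-contained sketch of the metric (independent of whatever scorer the training run actually used):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # One-row dynamic-programming edit distance over words.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = min(
                d[j] + 1,                            # deletion
                d[j - 1] + 1,                        # insertion
                prev + (ref[i - 1] != hyp[j - 1]),   # substitution / match
            )
            prev, d[j] = d[j], cur
    return d[len(hyp)] / len(ref)

# One substitution out of five reference words -> 0.2, i.e. 20% WER.
print(wer("the stars favour bold decisions", "the stars favor bold decisions"))
```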

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1
Model size

  • 242M params (F32, Safetensors)
