IoanaLivia/real_data_2_h_synth_standard_A_B_horoscope-whisper-small

This model is a fine-tuned version of openai/whisper-small on the IoanaLivia/real_data_2_h_synth_standard_A_B_horoscope dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2835
  • Wer: 18.5319
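WER (word error rate) counts word-level substitutions, insertions, and deletions per reference word, so a score of 18.5319 means roughly 18.5 word errors per 100 reference words. A minimal sketch of the metric (pure-Python word-level edit distance; the example strings are illustrative, not from the dataset):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the stars align today", "the stars online today"))  # 0.25
```

Note that the WER values on this card are expressed as percentages, i.e. 100 times this ratio.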

Model description

More information needed

Intended uses & limitations

More information needed
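In the absence of documented usage, a minimal transcription sketch with the standard transformers ASR pipeline (the model id is taken from this card's title; "audio.wav" is a placeholder path):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub; replace "audio.wav"
# with a path to your own audio file.
asr = pipeline(
    "automatic-speech-recognition",
    model="IoanaLivia/real_data_2_h_synth_standard_A_B_horoscope-whisper-small",
)
print(asr("audio.wav")["text"])
```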

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
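Two of the values above are derived: the total train batch size is the per-device batch size times the gradient-accumulation steps, and the linear scheduler decays the learning rate from its peak toward zero over the 990 training steps. A small sketch (it ignores warmup, which the card does not mention):

```python
learning_rate = 1e-05
train_batch_size = 8
gradient_accumulation_steps = 2

# total_train_batch_size = 8 * 2 = 16, matching the card.
total_train_batch_size = train_batch_size * gradient_accumulation_steps

def linear_lr(step: int, total_steps: int, peak: float = learning_rate) -> float:
    """Linear decay from the peak LR to zero (no warmup assumed)."""
    return peak * max(0.0, 1.0 - step / total_steps)

total_steps = 990  # 66 steps/epoch * 15 epochs, per the results table
print(total_train_batch_size)       # 16
print(linear_lr(495, total_steps))  # 5e-06, halfway through training
```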

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|---------------|-------|------|-----------------|---------|
| No log        | 0     | 0    | 0.4170          | 30.1014 |
| 0.2648        | 1.0   | 66   | 0.2766          | 18.9273 |
| 0.1007        | 2.0   | 132  | 0.2653          | 18.8413 |
| 0.047         | 3.0   | 198  | 0.2835          | 18.5319 |
| 0.0223        | 4.0   | 264  | 0.3050          | 18.7382 |
| 0.0111        | 5.0   | 330  | 0.3206          | 19.7696 |
| 0.0066        | 6.0   | 396  | 0.3394          | 19.4946 |
| 0.004         | 7.0   | 462  | 0.3476          | 18.5491 |
| 0.003         | 8.0   | 528  | 0.3550          | 18.5319 |
| 0.0027        | 9.0   | 594  | 0.3562          | 18.6007 |
| 0.0021        | 10.0  | 660  | 0.3619          | 18.6178 |
| 0.0019        | 11.0  | 726  | 0.3653          | 18.7038 |
| 0.0018        | 12.0  | 792  | 0.3678          | 18.7554 |
| 0.0016        | 13.0  | 858  | 0.3701          | 18.8069 |
| 0.0016        | 14.0  | 924  | 0.3715          | 18.8413 |
| 0.0015        | 15.0  | 990  | 0.3718          | 18.7554 |
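The headline numbers on this card (loss 0.2835, Wer 18.5319) match the epoch-3 row, which is the earliest row with the lowest validation WER in the table. A small sketch of that selection (rows abbreviated to epoch, validation loss, WER):

```python
# (epoch, validation_loss, wer) rows from the table above
# (epoch 0 is the evaluation before fine-tuning).
rows = [
    (0, 0.4170, 30.1014), (1, 0.2766, 18.9273), (2, 0.2653, 18.8413),
    (3, 0.2835, 18.5319), (4, 0.3050, 18.7382), (5, 0.3206, 19.7696),
    (6, 0.3394, 19.4946), (7, 0.3476, 18.5491), (8, 0.3550, 18.5319),
    (9, 0.3562, 18.6007), (10, 0.3619, 18.6178), (11, 0.3653, 18.7038),
    (12, 0.3678, 18.7554), (13, 0.3701, 18.8069), (14, 0.3715, 18.8413),
    (15, 0.3718, 18.7554),
]

# Earliest checkpoint with the lowest WER (ties broken by epoch).
best = min(rows, key=lambda r: (r[2], r[0]))
print(best)  # (3, 0.2835, 18.5319)
```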

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1
Model size

  • 242M params (safetensors, F32)