IoanaLivia/real_data_1_h_synth_wavenet_A_horoscope-whisper-small

This model is a fine-tuned version of openai/whisper-small on the IoanaLivia/real_data_1_h_synth_wavenet_A_horoscope dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2701
  • Wer: 18.6350
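
The WER above is reported as a percentage: the word-level edit distance between reference and hypothesis transcripts, divided by the number of reference words. A minimal sketch in plain Python (not the exact evaluation code used for this card, which typically relies on a library such as `jiwer`):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length, as a percentage."""
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] holds the edit distance between the processed prefix of ref and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        cur = [i]
        for j, h in enumerate(hyp, start=1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution
        prev = cur
    return 100.0 * prev[-1] / len(ref)

print(wer("the stars align today", "the stars aligned today"))  # 25.0
```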

Model description

More information needed

Intended uses & limitations

More information needed
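
Since the card gives no usage notes, inference would typically go through the `transformers` ASR pipeline; a minimal sketch (the silent waveform is just a placeholder input, assuming 16 kHz audio as expected by Whisper):

```python
import numpy as np
from transformers import pipeline

# Load the fine-tuned checkpoint into an automatic-speech-recognition pipeline
asr = pipeline(
    "automatic-speech-recognition",
    model="IoanaLivia/real_data_1_h_synth_wavenet_A_horoscope-whisper-small",
)

# One second of silence at 16 kHz, standing in for a real recording
audio = np.zeros(16000, dtype=np.float32)
print(asr(audio)["text"])
```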

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
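
The effective batch size and schedule follow directly from these values: 8 samples per device × 2 accumulation steps = 16 (the listed total_train_batch_size), and the learning rate decays linearly from 1e-05 toward 0 over the 690 optimizer steps (15 epochs × 46 steps, per the results table). A small sketch; the warmup-free linear decay is an assumption, since no warmup setting is listed:

```python
TRAIN_BATCH_SIZE = 8    # per-device batch size
GRAD_ACCUM_STEPS = 2    # gradient accumulation steps
BASE_LR = 1e-5          # peak learning rate
TOTAL_STEPS = 690       # 15 epochs x 46 optimizer steps per epoch

# Effective batch size seen by each optimizer update
effective_batch = TRAIN_BATCH_SIZE * GRAD_ACCUM_STEPS

def linear_lr(step: int) -> float:
    """Linear decay from BASE_LR to 0 over TOTAL_STEPS (no warmup assumed)."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

print(effective_batch)   # 16
print(linear_lr(0))      # 1e-05
print(linear_lr(345))    # 5e-06 (halfway through training)
```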

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| No log        | 0     | 0    | 0.4170          | 30.1014 |
| 0.3223        | 1.0   | 46   | 0.2836          | 19.9416 |
| 0.132         | 2.0   | 92   | 0.2701          | 18.6350 |
| 0.0647        | 3.0   | 138  | 0.2860          | 18.9789 |
| 0.0323        | 4.0   | 184  | 0.3034          | 19.1336 |
| 0.0158        | 5.0   | 230  | 0.3200          | 19.1680 |
| 0.0093        | 6.0   | 276  | 0.3409          | 18.9789 |
| 0.0061        | 7.0   | 322  | 0.3488          | 18.8757 |
| 0.0043        | 8.0   | 368  | 0.3561          | 19.2367 |
| 0.0036        | 9.0   | 414  | 0.3605          | 18.9960 |
| 0.0031        | 10.0  | 460  | 0.3660          | 19.1680 |
| 0.0027        | 11.0  | 506  | 0.3688          | 19.3571 |
| 0.0025        | 12.0  | 552  | 0.3713          | 19.3055 |
| 0.0023        | 13.0  | 598  | 0.3731          | 19.2711 |
| 0.0022        | 14.0  | 644  | 0.3743          | 19.2883 |
| 0.0022        | 15.0  | 690  | 0.3747          | 19.3227 |

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1

Safetensors

  • Model size: 242M params
  • Tensor type: F32
