IoanaLivia/real_data_6_h_synth_standard_A_horoscope-whisper-small

This model is a fine-tuned version of openai/whisper-small on the IoanaLivia/real_data_6_h_synth_standard_A_horoscope dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3101
  • Wer: 17.1910
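The reported WER is the word error rate in percent. As a reference, here is a minimal sketch of how WER is computed (a plain-Python word-level edit distance; the exact metric implementation used during evaluation is not stated in this card):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words, in percent."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words (Levenshtein).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + cost, # substitution (or match)
            )
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)

print(wer("the stars align today", "the stars are aligned today"))  # → 50.0
```

A WER of 17.19 therefore means roughly one word-level error per six reference words on the evaluation set.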

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
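As a sanity check on the numbers above, the total train batch size follows from the per-device batch size and gradient accumulation (a minimal sketch; variable names are illustrative):

```python
# Hyperparameters from the list above.
train_batch_size = 8
gradient_accumulation_steps = 2

# The effective (total) train batch size is their product: gradients are
# accumulated over 2 forward passes of 8 samples before each optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # → 16, matching the value reported above
```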

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|---------------|-------|------|-----------------|---------|
| No log        | 0     | 0    | 0.4170          | 30.1014 |
| 0.1573        | 1.0   | 151  | 0.2676          | 18.0162 |
| 0.0525        | 2.0   | 302  | 0.2681          | 17.3801 |
| 0.0226        | 3.0   | 453  | 0.2895          | 17.5176 |
| 0.0102        | 4.0   | 604  | 0.3101          | 17.1910 |
| 0.0053        | 5.0   | 755  | 0.3305          | 18.7382 |
| 0.0031        | 6.0   | 906  | 0.3422          | 19.6837 |
| 0.0022        | 7.0   | 1057 | 0.3547          | 18.6866 |
| 0.0015        | 8.0   | 1208 | 0.3584          | 18.2225 |
| 0.0011        | 9.0   | 1359 | 0.3690          | 18.5319 |
| 0.001         | 10.0  | 1510 | 0.3741          | 18.3772 |
| 0.0009        | 11.0  | 1661 | 0.3776          | 18.4459 |
| 0.0008        | 12.0  | 1812 | 0.3807          | 18.5319 |
| 0.0007        | 13.0  | 1963 | 0.3828          | 18.4631 |
| 0.0007        | 14.0  | 2114 | 0.3840          | 18.4459 |
| 0.0007        | 15.0  | 2265 | 0.3846          | 18.4459 |
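The headline numbers (loss 0.3101, WER 17.1910) correspond to the epoch-4 checkpoint, the row with the lowest validation WER; note that validation loss alone would instead favor epoch 1. A minimal sketch of selecting that checkpoint from the results (first six epochs shown; later epochs all have higher WER):

```python
# (epoch, validation_loss, wer) triples taken from the training-results table.
results = [
    (1, 0.2676, 18.0162),
    (2, 0.2681, 17.3801),
    (3, 0.2895, 17.5176),
    (4, 0.3101, 17.1910),
    (5, 0.3305, 18.7382),
    (6, 0.3422, 19.6837),
]

# Pick the checkpoint with the lowest word error rate.
best = min(results, key=lambda r: r[2])
print(best)  # → (4, 0.3101, 17.191)
```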

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1
Model size: 242M parameters (F32, safetensors)
