whisper-tiny-kor_eng_tiny_ps_ob

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7169
  • CER (character error rate): 12.5898
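The CER above is the character-level edit distance between the model's transcript and the reference, divided by the reference length and reported as a percentage. A minimal pure-Python sketch of that computation (same convention as the `evaluate` library's `cer` metric; the function name here is illustrative):

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: Levenshtein distance over reference length, as a percentage."""
    r, h = list(reference), list(hypothesis)
    # Dynamic-programming edit distance; prev[j] holds the distance
    # between r[:i-1] and h[:j] from the previous row.
    prev = list(range(len(h) + 1))
    for i in range(1, len(r) + 1):
        curr = [i] + [0] * len(h)
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            curr[j] = min(prev[j] + 1,        # deletion
                          curr[j - 1] + 1,    # insertion
                          prev[j - 1] + cost) # substitution
        prev = curr
    return 100.0 * prev[len(h)] / len(r)
```

For example, one wrong character in a four-character reference gives a CER of 25.0.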

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 12
  • eval_batch_size: 6
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 4000
  • mixed_precision_training: Native AMP
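The hyperparameters above can be expressed as a `Seq2SeqTrainingArguments` configuration. This is a sketch, not the author's actual training script: `output_dir` is a placeholder, and the 100-step evaluation interval is inferred from the results table below.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch reconstructing the listed hyperparameters; paths and the
# evaluation interval are assumptions, not taken from the model card.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-kor_eng_tiny_ps_ob",  # placeholder path
    learning_rate=3e-05,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=6,
    seed=42,
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=4000,
    fp16=True,                      # native AMP mixed-precision training
    eval_strategy="steps",          # assumption: matches the 100-step rows below
    eval_steps=100,
)
```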

Training results

| Training Loss | Epoch   | Step | Validation Loss | CER     |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| 1.1215        | 0.4167  | 100  | 0.9467          | 26.4777 |
| 0.8211        | 0.8333  | 200  | 0.7182          | 22.5991 |
| 0.5379        | 1.25    | 300  | 0.6071          | 15.1115 |
| 0.4346        | 1.6667  | 400  | 0.5519          | 11.8711 |
| 0.3403        | 2.0833  | 500  | 0.5439          | 12.8501 |
| 0.1993        | 2.5     | 600  | 0.5605          | 14.4827 |
| 0.1992        | 2.9167  | 700  | 0.5525          | 12.3203 |
| 0.1056        | 3.3333  | 800  | 0.5637          | 11.2175 |
| 0.0831        | 3.75    | 900  | 0.5797          | 19.1976 |
| 0.0666        | 4.1667  | 1000 | 0.6070          | 11.7379 |
| 0.0465        | 4.5833  | 1100 | 0.6067          | 14.9133 |
| 0.0453        | 5.0     | 1200 | 0.6096          | 12.7974 |
| 0.0221        | 5.4167  | 1300 | 0.6307          | 14.0613 |
| 0.0262        | 5.8333  | 1400 | 0.6271          | 11.9052 |
| 0.017         | 6.25    | 1500 | 0.6371          | 11.3259 |
| 0.0124        | 6.6667  | 1600 | 0.6460          | 12.4938 |
| 0.0126        | 7.0833  | 1700 | 0.6564          | 10.8922 |
| 0.0091        | 7.5     | 1800 | 0.6503          | 10.9139 |
| 0.007         | 7.9167  | 1900 | 0.6666          | 12.0973 |
| 0.0077        | 8.3333  | 2000 | 0.6709          | 10.9696 |
| 0.0046        | 8.75    | 2100 | 0.6740          | 10.7404 |
| 0.0049        | 9.1667  | 2200 | 0.6936          | 11.4157 |
| 0.0035        | 9.5833  | 2300 | 0.6826          | 11.6109 |
| 0.0025        | 10.0    | 2400 | 0.7020          | 11.9300 |
| 0.0028        | 10.4167 | 2500 | 0.7051          | 15.3779 |
| 0.0028        | 10.8333 | 2600 | 0.6954          | 12.4938 |
| 0.0023        | 11.25   | 2700 | 0.6995          | 11.4870 |
| 0.0013        | 11.6667 | 2800 | 0.7008          | 12.8160 |
| 0.0011        | 12.0833 | 2900 | 0.7056          | 12.8625 |
| 0.0012        | 12.5    | 3000 | 0.7057          | 13.0235 |
| 0.0024        | 12.9167 | 3100 | 0.7057          | 12.8903 |
| 0.001         | 13.3333 | 3200 | 0.7100          | 12.9833 |
| 0.0008        | 13.75   | 3300 | 0.7135          | 11.3910 |
| 0.0014        | 14.1667 | 3400 | 0.7108          | 12.8222 |
| 0.0007        | 14.5833 | 3500 | 0.7127          | 12.7757 |
| 0.0006        | 15.0    | 3600 | 0.7142          | 12.8470 |
| 0.0006        | 15.4167 | 3700 | 0.7152          | 12.9244 |
| 0.0007        | 15.8333 | 3800 | 0.7158          | 12.6983 |
| 0.0006        | 16.25   | 3900 | 0.7167          | 12.5991 |
| 0.0006        | 16.6667 | 4000 | 0.7169          | 12.5898 |
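The linear schedule used above (learning_rate 3e-05, 500 warmup steps, 4000 total steps) ramps the learning rate from 0 to the peak over the warmup, then decays it linearly to 0. A small sketch of the rate at any optimizer step, under those settings:

```python
PEAK_LR = 3e-05  # learning_rate
WARMUP = 500     # lr_scheduler_warmup_steps
TOTAL = 4000     # training_steps

def linear_schedule_lr(step: int) -> float:
    """Learning rate at `step` under linear warmup followed by linear decay to 0."""
    if step < WARMUP:
        return PEAK_LR * step / WARMUP
    return PEAK_LR * (TOTAL - step) / (TOTAL - WARMUP)
```

At step 500 the rate peaks at 3e-05, at step 2250 (halfway through the decay) it is 1.5e-05, and it reaches 0 at step 4000.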

Framework versions

  • Transformers 4.51.3
  • PyTorch 2.6.0+cu126
  • Datasets 3.5.0
  • Tokenizers 0.21.1