whisper-coastal-paiwan

This model is a fine-tuned version of openai/whisper-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9448
  • WER: 37.2570

Model description

More information needed

Intended uses & limitations

More information needed
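Pending fuller documentation, the checkpoint can be exercised with the standard transformers automatic-speech-recognition pipeline. Below is a minimal sketch; the repo id aslinguist/whisper-coastal-paiwan and the local audio file name are assumptions for illustration, not confirmed by this card:

```python
# Minimal transcription sketch. The repo id and audio file name are
# assumptions for illustration, not confirmed by this card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="aslinguist/whisper-coastal-paiwan",  # assumed repo id
)

# The pipeline decodes common audio formats (via ffmpeg, if available)
# and resamples to the 16 kHz rate Whisper expects.
result = asr("sample.wav")  # assumed local audio file
print(result["text"])
```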

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • training_steps: 5000
  • mixed_precision_training: Native AMP
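For reference, the list above maps onto a Seq2SeqTrainingArguments configuration roughly like the sketch below. The output_dir is an assumption, and the 500-step evaluation cadence is inferred from the results table rather than stated in the card:

```python
# Configuration sketch mirroring the listed hyperparameters.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-coastal-paiwan",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",        # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=5000,
    fp16=True,                  # native AMP mixed-precision training
    eval_strategy="steps",      # inferred: evaluation logged every 500 steps
    eval_steps=500,
)
```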

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER     |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| 0.2293        | 3.8760  | 500  | 0.7556          | 49.3880 |
| 0.0754        | 7.7519  | 1000 | 0.8284          | 53.9957 |
| 0.0354        | 11.6279 | 1500 | 0.8634          | 44.2405 |
| 0.0305        | 15.5039 | 2000 | 0.9155          | 39.8128 |
| 0.0140        | 19.3798 | 2500 | 0.9610          | 41.5407 |
| 0.0119        | 23.2558 | 3000 | 0.9340          | 38.0490 |
| 0.0020        | 27.1318 | 3500 | 0.9220          | 37.9050 |
| 0.0026        | 31.0078 | 4000 | 0.9313          | 36.8611 |
| 0.0018        | 34.8837 | 4500 | 0.9397          | 36.9690 |
| 0.0013        | 38.7597 | 5000 | 0.9448          | 37.2570 |
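The WER column is presumably a standard word error rate reported as a percentage. A sketch of how such a score can be computed with the evaluate library (the tooling choice is an assumption):

```python
# Word-error-rate sketch using Hugging Face's evaluate library; the
# example strings are placeholders, not data from this model's eval set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["a transcribed hypothesis"]
references = ["the reference transcript"]

# compute() returns a fraction; the table reports WER scaled by 100.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```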

Framework versions

  • Transformers 4.51.2
  • PyTorch 2.2.2+cu118
  • Datasets 3.5.0
  • Tokenizers 0.21.1
