---
license: apache-2.0
base_model: facebook/wav2vec2-base-960h
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: 250119-centralized_learning
  results: []
---

# 250119-centralized_learning

This model is a fine-tuned version of [facebook/wav2vec2-base-960h](https://huggingface.co/facebook/wav2vec2-base-960h) on the None dataset. It achieves the following results on the evaluation set:

  • Loss: 7247.8286
  • Wer: 1.0
  • Cer: 0.9981
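A WER of 1.0 means no reference words were recognized correctly, and a CER of 0.9981 indicates a near-total character-level mismatch. As a rough illustration (not the exact scorer used in training), word error rate is the word-level Levenshtein distance between reference and hypothesis, divided by the number of reference words:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance over token lists (substitutions, insertions, deletions)."""
    m, n = len(ref), len(hyp)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / number of reference words."""
    ref_words = reference.split()
    hyp_words = hypothesis.split()
    return edit_distance(ref_words, hyp_words) / max(len(ref_words), 1)
```

For example, `wer("hello world", "")` returns 1.0: an empty hypothesis misses every reference word, which is consistent with the score reported above.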

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
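For reference, the settings above roughly correspond to the following 🤗 Transformers `TrainingArguments` (a hedged reconstruction, not the original training script; the Adam betas and epsilon listed above are the `Trainer` defaults):

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="250119-centralized_learning",
    learning_rate=1e-3,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```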

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Wer | Cer    |
|:-------------:|:-----:|:------:|:---------------:|:---:|:------:|
| 3632.3885     | 1.0   | 2900   | 7247.8286       | 1.0 | 0.9981 |
| 3631.472      | 2.0   | 5800   | 7247.4536       | 1.0 | 0.9981 |
| 3634.3392     | 3.0   | 8700   | 7247.3838       | 1.0 | 0.9981 |
| 3624.8195     | 4.0   | 11600  | 7247.3589       | 1.0 | 0.9981 |
| 3628.096      | 5.0   | 14500  | 7247.3486       | 1.0 | 0.9981 |
| 3625.4922     | 6.0   | 17400  | 7247.3848       | 1.0 | 0.9981 |
| 3624.7733     | 7.0   | 20300  | 7247.3320       | 1.0 | 0.9981 |
| 3625.468      | 8.0   | 23200  | 7247.3823       | 1.0 | 0.9981 |
| 3624.3375     | 9.0   | 26100  | 7247.3701       | 1.0 | 0.9981 |
| 3625.2198     | 10.0  | 29000  | 7247.3511       | 1.0 | 0.9981 |
| 3624.0125     | 11.0  | 31900  | 7247.2944       | 1.0 | 0.9981 |
| 3624.4412     | 12.0  | 34800  | 7247.3662       | 1.0 | 0.9981 |
| 3624.0878     | 13.0  | 37700  | 7247.3164       | 1.0 | 0.9981 |
| 3623.9817     | 14.0  | 40600  | 7247.2944       | 1.0 | 0.9981 |
| 3624.541      | 15.0  | 43500  | 7247.5161       | 1.0 | 0.9981 |
| 3624.0802     | 16.0  | 46400  | 7247.5713       | 1.0 | 0.9981 |
| 3624.113      | 17.0  | 49300  | 7247.6650       | 1.0 | 0.9981 |
| 3714.1267     | 18.0  | 52200  | nan             | 1.0 | 0.9981 |
| 0.0           | 19.0  | 55100  | nan             | 1.0 | 0.9981 |
| 0.0           | 20.0  | 58000  | nan             | 1.0 | 0.9981 |
| 0.0           | 21.0  | 60900  | nan             | 1.0 | 0.9981 |
| 0.0           | 22.0  | 63800  | nan             | 1.0 | 0.9981 |
| 0.0           | 23.0  | 66700  | nan             | 1.0 | 0.9981 |
| 0.0           | 24.0  | 69600  | nan             | 1.0 | 0.9981 |
| 0.0           | 25.0  | 72500  | nan             | 1.0 | 0.9981 |
| 0.0           | 26.0  | 75400  | nan             | 1.0 | 0.9981 |
| 0.0           | 27.0  | 78300  | nan             | 1.0 | 0.9981 |
| 0.0           | 28.0  | 81200  | nan             | 1.0 | 0.9981 |
| 0.0           | 29.0  | 84100  | nan             | 1.0 | 0.9981 |
| 0.0           | 30.0  | 87000  | nan             | 1.0 | 0.9981 |
| 0.0           | 31.0  | 89900  | nan             | 1.0 | 0.9981 |
| 0.0           | 32.0  | 92800  | nan             | 1.0 | 0.9981 |
| 0.0           | 33.0  | 95700  | nan             | 1.0 | 0.9981 |
| 0.0           | 34.0  | 98600  | nan             | 1.0 | 0.9981 |
| 0.0           | 35.0  | 101500 | nan             | 1.0 | 0.9981 |
| 0.0           | 36.0  | 104400 | nan             | 1.0 | 0.9981 |
| 0.0           | 37.0  | 107300 | nan             | 1.0 | 0.9981 |
| 0.0           | 38.0  | 110200 | nan             | 1.0 | 0.9981 |
| 0.0           | 39.0  | 113100 | nan             | 1.0 | 0.9981 |
| 0.0           | 40.0  | 116000 | nan             | 1.0 | 0.9981 |
| 0.0           | 41.0  | 118900 | nan             | 1.0 | 0.9981 |
| 0.0           | 42.0  | 121800 | nan             | 1.0 | 0.9981 |
| 0.0           | 43.0  | 124700 | nan             | 1.0 | 0.9981 |
| 0.0           | 44.0  | 127600 | nan             | 1.0 | 0.9981 |
| 0.0           | 45.0  | 130500 | nan             | 1.0 | 0.9981 |
| 0.0           | 46.0  | 133400 | nan             | 1.0 | 0.9981 |
| 0.0           | 47.0  | 136300 | nan             | 1.0 | 0.9981 |
| 0.0           | 48.0  | 139200 | nan             | 1.0 | 0.9981 |
| 0.0           | 49.0  | 142100 | nan             | 1.0 | 0.9981 |
| 0.0           | 50.0  | 145000 | nan             | 1.0 | 0.9981 |
| 0.0           | 51.0  | 147900 | nan             | 1.0 | 0.9981 |
| 0.0           | 52.0  | 150800 | nan             | 1.0 | 0.9981 |
| 0.0           | 53.0  | 153700 | nan             | 1.0 | 0.9981 |
| 0.0           | 54.0  | 156600 | nan             | 1.0 | 0.9981 |
| 0.0           | 55.0  | 159500 | nan             | 1.0 | 0.9981 |
| 0.0           | 56.0  | 162400 | nan             | 1.0 | 0.9981 |
| 0.0           | 57.0  | 165300 | nan             | 1.0 | 0.9981 |
| 0.0           | 58.0  | 168200 | nan             | 1.0 | 0.9981 |
| 0.0           | 59.0  | 171100 | nan             | 1.0 | 0.9981 |
| 0.0           | 60.0  | 174000 | nan             | 1.0 | 0.9981 |
| 0.0           | 61.0  | 176900 | nan             | 1.0 | 0.9981 |
| 0.0           | 62.0  | 179800 | nan             | 1.0 | 0.9981 |
| 0.0           | 63.0  | 182700 | nan             | 1.0 | 0.9981 |
| 0.0           | 64.0  | 185600 | nan             | 1.0 | 0.9981 |
| 0.0           | 65.0  | 188500 | nan             | 1.0 | 0.9981 |
| 0.0           | 66.0  | 191400 | nan             | 1.0 | 0.9981 |
| 0.0           | 67.0  | 194300 | nan             | 1.0 | 0.9981 |
| 0.0           | 68.0  | 197200 | nan             | 1.0 | 0.9981 |
| 0.0           | 69.0  | 200100 | nan             | 1.0 | 0.9981 |
| 0.0           | 70.0  | 203000 | nan             | 1.0 | 0.9981 |
| 0.0           | 71.0  | 205900 | nan             | 1.0 | 0.9981 |
| 0.0           | 72.0  | 208800 | nan             | 1.0 | 0.9981 |
| 0.0           | 73.0  | 211700 | nan             | 1.0 | 0.9981 |
| 0.0           | 74.0  | 214600 | nan             | 1.0 | 0.9981 |
| 0.0           | 75.0  | 217500 | nan             | 1.0 | 0.9981 |
| 0.0           | 76.0  | 220400 | nan             | 1.0 | 0.9981 |
| 0.0           | 77.0  | 223300 | nan             | 1.0 | 0.9981 |
| 0.0           | 78.0  | 226200 | nan             | 1.0 | 0.9981 |
| 0.0           | 79.0  | 229100 | nan             | 1.0 | 0.9981 |
| 0.0           | 80.0  | 232000 | nan             | 1.0 | 0.9981 |
| 0.0           | 81.0  | 234900 | nan             | 1.0 | 0.9981 |
| 0.0           | 82.0  | 237800 | nan             | 1.0 | 0.9981 |
| 0.0           | 83.0  | 240700 | nan             | 1.0 | 0.9981 |
| 0.0           | 84.0  | 243600 | nan             | 1.0 | 0.9981 |
| 0.0           | 85.0  | 246500 | nan             | 1.0 | 0.9981 |
| 0.0           | 86.0  | 249400 | nan             | 1.0 | 0.9981 |
| 0.0           | 87.0  | 252300 | nan             | 1.0 | 0.9981 |
| 0.0           | 88.0  | 255200 | nan             | 1.0 | 0.9981 |
| 0.0           | 89.0  | 258100 | nan             | 1.0 | 0.9981 |
| 0.0           | 90.0  | 261000 | nan             | 1.0 | 0.9981 |
| 0.0           | 91.0  | 263900 | nan             | 1.0 | 0.9981 |
| 0.0           | 92.0  | 266800 | nan             | 1.0 | 0.9981 |
| 0.0           | 93.0  | 269700 | nan             | 1.0 | 0.9981 |
| 0.0           | 94.0  | 272600 | nan             | 1.0 | 0.9981 |
| 0.0           | 95.0  | 275500 | nan             | 1.0 | 0.9981 |
| 0.0           | 96.0  | 278400 | nan             | 1.0 | 0.9981 |
| 0.0           | 97.0  | 281300 | nan             | 1.0 | 0.9981 |
| 0.0           | 98.0  | 284200 | nan             | 1.0 | 0.9981 |
| 0.0           | 99.0  | 287100 | nan             | 1.0 | 0.9981 |
| 0.0           | 100.0 | 290000 | nan             | 1.0 | 0.9981 |
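Note that the validation loss becomes `nan` from epoch 18 onward while the logged training loss collapses to 0.0, a common symptom of numerical divergence under mixed precision at a high learning rate. A minimal, framework-agnostic sketch of an early-abort check on a stream of loss values (the helper name is illustrative, not part of the training script):

```python
import math

def first_bad_step(losses):
    """Return the index of the first non-finite loss, or None if all are finite."""
    for step, loss in enumerate(losses):
        if not math.isfinite(loss):  # catches nan and +/-inf
            return step
    return None
```

Training could then be halted, or the last good checkpoint restored, as soon as `first_bad_step` reports a hit, instead of spending 82 further epochs on a diverged model.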

### Framework versions

  • Transformers 4.43.3
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.19.1