# mHuBERT-147
This model is a fine-tuned version of utter-project/mHuBERT-147 on the LEONEL-MAIA/FULFULDE-BALANCED - DEFAULT dataset. It achieves the following results on the evaluation set:
- Loss: 0.3128
- Wer: 0.5538
- Cer: 0.1521
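The WER and CER figures above are word- and character-level edit distances normalized by reference length. A minimal pure-Python sketch of the metric (not the exact implementation used for this evaluation, which would typically rely on a library such as `jiwer` or `evaluate`):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences via dynamic programming."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))  # dp[j] = distance between ref[:i] and hyp[:j]
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            # deletion, insertion, or substitution/match
            prev, dp[j] = dp[j], min(dp[j] + 1,
                                     dp[j - 1] + 1,
                                     prev + (ref[i - 1] != hyp[j - 1]))
    return dp[n]

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, `wer("a b c", "a x c")` is one substitution over three reference words, i.e. about 0.333.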
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 60.0
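With these settings, the linear scheduler ramps the learning rate from 0 to 5e-05 over the first 1000 steps, then decays it linearly toward 0 over the remaining steps. A sketch of that schedule; the total step count is an assumption inferred from the log below (roughly 2,220 optimizer steps per epoch × 60 epochs), not stated in the card:

```python
def linear_warmup_lr(step, peak_lr=5e-05, warmup_steps=1000, total_steps=133_400):
    """Linear warmup to peak_lr, then linear decay to 0.

    total_steps is a hypothetical value estimated from the training log
    (step 500 ~ epoch 0.2249 implies ~2,223 steps/epoch, x 60 epochs).
    """
    if step < warmup_steps:
        return peak_lr * step / warmup_steps          # warmup phase
    remaining = total_steps - warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / remaining)  # decay phase
```

Note that the effective batch size of 32 in the card is simply train_batch_size (8) × gradient_accumulation_steps (4).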
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|---|---|---|---|---|---|
| 4.9426 | 0.2249 | 500 | 4.7840 | 1.0 | 1.0 |
| 2.8932 | 0.4498 | 1000 | 2.8770 | 1.0 | 1.0 |
| 0.8514 | 0.6748 | 1500 | 0.6352 | 0.7061 | 0.1976 |
| 0.6178 | 0.8997 | 2000 | 0.4910 | 0.6470 | 0.1794 |
| 0.541 | 1.1246 | 2500 | 0.4469 | 0.6204 | 0.1703 |
| 0.487 | 1.3495 | 3000 | 0.4025 | 0.6087 | 0.1676 |
| 0.5479 | 1.5744 | 3500 | 0.3921 | 0.5957 | 0.1641 |
| 0.4922 | 1.7994 | 4000 | 0.3829 | 0.5888 | 0.1613 |
| 0.4452 | 2.0243 | 4500 | 0.3672 | 0.5826 | 0.1602 |
| 0.4606 | 2.2492 | 5000 | 0.3616 | 0.5793 | 0.1595 |
| 0.4586 | 2.4741 | 5500 | 0.3463 | 0.5740 | 0.1582 |
| 0.4599 | 2.6991 | 6000 | 0.3457 | 0.5711 | 0.1578 |
| 0.4259 | 2.9240 | 6500 | 0.3371 | 0.5677 | 0.1561 |
| 0.4016 | 3.1489 | 7000 | 0.3441 | 0.5624 | 0.1546 |
| 0.382 | 3.3738 | 7500 | 0.3402 | 0.5613 | 0.1543 |
| 0.4147 | 3.5987 | 8000 | 0.3331 | 0.5583 | 0.1529 |
| 0.3306 | 3.8237 | 8500 | 0.3338 | 0.5561 | 0.1523 |
| 0.4095 | 4.0486 | 9000 | 0.3193 | 0.5592 | 0.1545 |
| 0.3875 | 4.2735 | 9500 | 0.3185 | 0.5538 | 0.1523 |
| 0.3915 | 4.4984 | 10000 | 0.3119 | 0.5534 | 0.1523 |
| 0.3846 | 4.7233 | 10500 | 0.3222 | 0.5516 | 0.1513 |
| 0.3817 | 4.9483 | 11000 | 0.3206 | 0.5484 | 0.1510 |
| 0.3129 | 5.1732 | 11500 | 0.3232 | 0.5473 | 0.1504 |
## Framework versions

- Transformers 4.50.3
- PyTorch 2.7.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1