# wav2vec-bert-2.0-ulch-try
This model is a fine-tuned version of facebook/w2v-bert-2.0 on the audiofolder dataset. It achieves the following results on the evaluation set:
- Loss: 1.3563
- Wer: 0.4872
- Cer: 0.1704
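The reported WER and CER are edit-distance metrics: Levenshtein distance over words (WER) or characters (CER), normalized by the reference length. A minimal sketch of how they are computed (the actual run likely used the `evaluate`/`jiwer` implementations):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences, via dynamic programming."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,                # deletion
                           cur[j - 1] + 1,             # insertion
                           prev[j - 1] + (r != h)))    # substitution
        prev = cur
    return prev[-1]

def wer(ref, hyp):
    """Word error rate: word-level edits / reference word count."""
    ref_words = ref.split()
    return edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate: character-level edits / reference length."""
    return edit_distance(ref, hyp) / len(ref)
```

For example, `wer("a b c d", "a x c")` is 0.5 (one substitution plus one deletion over four reference words).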
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
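With `train_batch_size: 1` and `gradient_accumulation_steps: 8`, the effective batch size is 1 × 8 = 8, matching `total_train_batch_size`. The learning-rate schedule is a linear warmup over 500 steps followed by linear decay to zero; a sketch of that schedule (mirroring `transformers.get_linear_schedule_with_warmup`, with the total step count of ~14760 read off the last row of the training table below):

```python
def lr_at(step, base_lr=5e-5, warmup=500, total_steps=14760):
    """Linear warmup to base_lr over `warmup` steps, then linear decay to 0."""
    if step < warmup:
        return base_lr * step / warmup  # warmup phase
    # decay phase: falls linearly from base_lr at `warmup` to 0 at `total_steps`
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup))
```

For example, `lr_at(250)` gives 2.5e-05 (halfway through warmup) and `lr_at(14760)` gives 0.0.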
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
4.4139 | 0.1219 | 180 | 1.7147 | 0.9451 | 0.4056 |
1.5054 | 0.2438 | 360 | 1.4105 | 0.8582 | 0.3543 |
1.4072 | 0.3656 | 540 | 1.3314 | 0.8832 | 0.3701 |
1.236 | 0.4875 | 720 | 1.1863 | 0.7590 | 0.2799 |
1.1715 | 0.6094 | 900 | 1.1086 | 0.7380 | 0.2837 |
1.1484 | 0.7313 | 1080 | 1.0823 | 0.6931 | 0.2509 |
1.1007 | 0.8532 | 1260 | 1.0204 | 0.7211 | 0.2560 |
1.0404 | 0.9750 | 1440 | 0.9578 | 0.7289 | 0.2511 |
0.9031 | 1.0968 | 1620 | 0.9397 | 0.6513 | 0.2303 |
0.9951 | 1.2187 | 1800 | 0.9252 | 0.6504 | 0.2309 |
0.842 | 1.3406 | 1980 | 0.9184 | 0.6139 | 0.2190 |
0.8247 | 1.4625 | 2160 | 0.8968 | 0.6143 | 0.2208 |
0.867 | 1.5843 | 2340 | 0.8945 | 0.6140 | 0.2192 |
0.8674 | 1.7062 | 2520 | 0.8362 | 0.6092 | 0.2097 |
0.818 | 1.8281 | 2700 | 0.8848 | 0.6116 | 0.2130 |
0.7734 | 1.9500 | 2880 | 0.8408 | 0.5740 | 0.2002 |
0.7266 | 2.0718 | 3060 | 0.8358 | 0.5806 | 0.2041 |
0.6954 | 2.1937 | 3240 | 0.8125 | 0.5705 | 0.1985 |
0.6441 | 2.3155 | 3420 | 0.8175 | 0.5714 | 0.1985 |
0.6673 | 2.4374 | 3600 | 0.8032 | 0.5637 | 0.1974 |
0.6947 | 2.5593 | 3780 | 0.8041 | 0.5884 | 0.2165 |
0.6856 | 2.6812 | 3960 | 0.7781 | 0.5513 | 0.1952 |
0.6401 | 2.8030 | 4140 | 0.7975 | 0.5548 | 0.1935 |
0.6342 | 2.9249 | 4320 | 0.7842 | 0.5377 | 0.1899 |
0.6434 | 3.0467 | 4500 | 0.7777 | 0.5395 | 0.1874 |
0.5545 | 3.1686 | 4680 | 0.7825 | 0.5479 | 0.1941 |
0.5798 | 3.2905 | 4860 | 0.7889 | 0.5335 | 0.1866 |
0.5142 | 3.4124 | 5040 | 0.7797 | 0.5371 | 0.1918 |
0.5413 | 3.5342 | 5220 | 0.7797 | 0.5461 | 0.1891 |
0.5282 | 3.6561 | 5400 | 0.7814 | 0.5357 | 0.1868 |
0.5308 | 3.7780 | 5580 | 0.7649 | 0.5202 | 0.1802 |
0.4949 | 3.8999 | 5760 | 0.7638 | 0.5458 | 0.1856 |
0.4853 | 4.0217 | 5940 | 0.7786 | 0.5247 | 0.1803 |
0.4026 | 4.1435 | 6120 | 0.7678 | 0.5277 | 0.1857 |
0.428 | 4.2654 | 6300 | 0.7772 | 0.5217 | 0.1795 |
0.394 | 4.3873 | 6480 | 0.7642 | 0.5118 | 0.1779 |
0.418 | 4.5092 | 6660 | 0.7672 | 0.5219 | 0.1809 |
0.4245 | 4.6311 | 6840 | 0.7464 | 0.5113 | 0.1811 |
0.4657 | 4.7529 | 7020 | 0.7470 | 0.5211 | 0.1859 |
0.4285 | 4.8748 | 7200 | 0.7455 | 0.5269 | 0.1858 |
0.416 | 4.9967 | 7380 | 0.7357 | 0.5325 | 0.1860 |
0.3282 | 5.1185 | 7560 | 0.8268 | 0.5086 | 0.1791 |
0.3192 | 5.2404 | 7740 | 0.8087 | 0.5088 | 0.1781 |
0.3262 | 5.3623 | 7920 | 0.7779 | 0.5161 | 0.1836 |
0.3152 | 5.4841 | 8100 | 0.8079 | 0.5064 | 0.1766 |
0.3622 | 5.6060 | 8280 | 0.8168 | 0.5062 | 0.1789 |
0.318 | 5.7279 | 8460 | 0.8088 | 0.4976 | 0.1744 |
0.3107 | 5.8498 | 8640 | 0.8074 | 0.5020 | 0.1792 |
0.3017 | 5.9716 | 8820 | 0.7807 | 0.5068 | 0.1759 |
0.2594 | 6.0934 | 9000 | 0.9053 | 0.5003 | 0.1740 |
0.2994 | 6.2153 | 9180 | 0.8920 | 0.5022 | 0.1746 |
0.2288 | 6.3372 | 9360 | 0.9006 | 0.4981 | 0.1773 |
0.2345 | 6.4591 | 9540 | 0.8932 | 0.4963 | 0.1761 |
0.2287 | 6.5810 | 9720 | 0.8502 | 0.4961 | 0.1734 |
0.2291 | 6.7028 | 9900 | 0.8334 | 0.5010 | 0.1787 |
0.2075 | 6.8247 | 10080 | 0.8773 | 0.4869 | 0.1726 |
0.2182 | 6.9466 | 10260 | 0.8423 | 0.4964 | 0.1754 |
0.1815 | 7.0684 | 10440 | 0.9376 | 0.4936 | 0.1733 |
0.1586 | 7.1903 | 10620 | 0.9601 | 0.5016 | 0.1751 |
0.1578 | 7.3121 | 10800 | 0.9650 | 0.4894 | 0.1726 |
0.1489 | 7.4340 | 10980 | 0.9943 | 0.4852 | 0.1700 |
0.153 | 7.5559 | 11160 | 1.0456 | 0.4832 | 0.1686 |
0.1501 | 7.6778 | 11340 | 1.0008 | 0.4857 | 0.1714 |
0.1397 | 7.7997 | 11520 | 0.9729 | 0.4878 | 0.1740 |
0.1554 | 7.9215 | 11700 | 0.9443 | 0.4826 | 0.1710 |
0.1732 | 8.0433 | 11880 | 1.0668 | 0.4854 | 0.1707 |
0.0863 | 8.1652 | 12060 | 1.1487 | 0.4849 | 0.1708 |
0.0886 | 8.2871 | 12240 | 1.1944 | 0.4866 | 0.1714 |
0.0986 | 8.4090 | 12420 | 1.1378 | 0.4818 | 0.1691 |
0.137 | 8.5309 | 12600 | 1.1633 | 0.4923 | 0.1716 |
0.0954 | 8.6527 | 12780 | 1.1348 | 0.4910 | 0.1713 |
0.101 | 8.7746 | 12960 | 1.0933 | 0.4883 | 0.1715 |
0.0833 | 8.8965 | 13140 | 1.1138 | 0.4951 | 0.1721 |
0.1005 | 9.0183 | 13320 | 1.1980 | 0.4866 | 0.1699 |
0.0476 | 9.1402 | 13500 | 1.3154 | 0.4908 | 0.1721 |
0.0529 | 9.2620 | 13680 | 1.3140 | 0.4874 | 0.1720 |
0.0532 | 9.3839 | 13860 | 1.3547 | 0.4873 | 0.1711 |
0.0499 | 9.5058 | 14040 | 1.3280 | 0.4850 | 0.1704 |
0.0538 | 9.6277 | 14220 | 1.3608 | 0.4858 | 0.1705 |
0.0529 | 9.7496 | 14400 | 1.3606 | 0.4838 | 0.1704 |
0.0504 | 9.8714 | 14580 | 1.3553 | 0.4876 | 0.1706 |
0.0411 | 9.9933 | 14760 | 1.3563 | 0.4872 | 0.1704 |
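The card does not state the training-set size, but the step/epoch ratio in the table, together with the effective batch size of 8, implies it. A back-of-envelope check (inferred arithmetic, not a stated figure):

```python
# Final logged row of the training table: 14760 steps at epoch 9.9933.
steps, epochs = 14760, 9.9933
steps_per_epoch = steps / epochs           # ~1477 optimizer steps per epoch
# Each optimizer step consumes total_train_batch_size = 8 examples,
# so the training set holds roughly 1477 * 8 ≈ 11.8k examples.
approx_train_examples = steps_per_epoch * 8
```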
## Framework versions
- Transformers 4.51.3
- PyTorch 2.7.0+cu126
- Datasets 3.5.1
- Tokenizers 0.21.1
## Base model

Ber5h/wav2vec-bert-2.0-ulch-try is fine-tuned from facebook/w2v-bert-2.0.