Yoruba W2v-BERT 2.0 Models
This model is a fine-tuned version of facebook/w2v-bert-2.0 on the NAIJAVOICES_YORUBA_250H - NA dataset. It achieves the following results on the evaluation set (final logged validation step; see the training table below):

- Loss: 0.4915
- WER: 0.4371
- CER: 0.2602

Model description, intended uses and limitations, and details of the training and evaluation data: more information needed.
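A minimal inference sketch, assuming the checkpoint is a standard `transformers` Wav2Vec2-BERT CTC fine-tune; the repo id and audio path below are placeholders, not the actual checkpoint name. The `collapse_ctc` helper only illustrates the greedy CTC decoding that `processor.batch_decode` performs internally.

```python
def collapse_ctc(ids, blank_id=0):
    """Greedy CTC post-processing: merge repeated ids, then drop blank tokens.
    This mirrors what the CTC tokenizer does inside processor.batch_decode."""
    out, prev = [], None
    for i in ids:
        if i != prev:
            out.append(i)
        prev = i
    return [i for i in out if i != blank_id]


if __name__ == "__main__":
    # Heavy dependencies kept inside the guard; this part needs the real checkpoint.
    import torch
    import librosa
    from transformers import Wav2Vec2BertForCTC, Wav2Vec2BertProcessor

    repo = "your-org/w2v-bert-2.0-yoruba"  # placeholder: substitute the real repo id
    processor = Wav2Vec2BertProcessor.from_pretrained(repo)
    model = Wav2Vec2BertForCTC.from_pretrained(repo).eval()

    # The model expects 16 kHz mono audio.
    speech, _ = librosa.load("sample.wav", sr=16_000)
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # (batch, time, vocab)
    pred_ids = torch.argmax(logits, dim=-1)
    print(processor.batch_decode(pred_ids)[0])
```

The CTC blank id is usually 0 for these tokenizers, but check `processor.tokenizer.pad_token_id` on the actual checkpoint.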
Validation loss, WER, and CER were logged every 1000 training steps:
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|---|---|---|---|---|---|
| 0.9344 | 1.2837 | 1000 | 0.8891 | 0.6861 | 0.3482 |
| 0.7957 | 2.5674 | 2000 | 0.6957 | 0.6096 | 0.3201 |
| 0.6502 | 3.8511 | 3000 | 0.6526 | 0.5753 | 0.3087 |
| 0.6207 | 5.1348 | 4000 | 0.6331 | 0.5625 | 0.3179 |
| 0.6032 | 6.4185 | 5000 | 0.6054 | 0.5553 | 0.3155 |
| 0.7118 | 7.7022 | 6000 | 0.6006 | 0.5580 | 0.2983 |
| 0.5953 | 8.9859 | 7000 | 0.5827 | 0.5371 | 0.2933 |
| 0.5799 | 10.2696 | 8000 | 0.5903 | 0.5310 | 0.3103 |
| 0.5432 | 11.5533 | 9000 | 0.5545 | 0.5284 | 0.2891 |
| 0.5047 | 12.8370 | 10000 | 0.5423 | 0.5197 | 0.2914 |
| 0.5812 | 14.1207 | 11000 | 0.5366 | 0.5127 | 0.2982 |
| 0.5 | 15.4044 | 12000 | 0.5234 | 0.5072 | 0.2837 |
| 0.5229 | 16.6881 | 13000 | 0.5180 | 0.5050 | 0.2851 |
| 0.5162 | 17.9718 | 14000 | 0.5127 | 0.5044 | 0.2859 |
| 0.5709 | 19.2555 | 15000 | 0.5088 | 0.5009 | 0.2831 |
| 0.4316 | 20.5392 | 16000 | 0.5026 | 0.4837 | 0.2828 |
| 0.4892 | 21.8228 | 17000 | 0.4984 | 0.4861 | 0.2830 |
| 0.3795 | 23.1065 | 18000 | 0.5082 | 0.4826 | 0.2728 |
| 0.3209 | 24.3902 | 19000 | 0.4842 | 0.4724 | 0.2712 |
| 0.3633 | 25.6739 | 20000 | 0.4732 | 0.4659 | 0.2720 |
| 0.4002 | 26.9576 | 21000 | 0.4832 | 0.4618 | 0.2665 |
| 0.2956 | 28.2413 | 22000 | 0.4756 | 0.4444 | 0.2646 |
| 0.2779 | 29.5250 | 23000 | 0.4837 | 0.4499 | 0.2680 |
| 0.2933 | 30.8087 | 24000 | 0.4563 | 0.4453 | 0.2662 |
| 0.2412 | 32.0924 | 25000 | 0.4681 | 0.4410 | 0.2622 |
| 0.2966 | 33.3761 | 26000 | 0.4804 | 0.4453 | 0.2607 |
| 0.2494 | 34.6598 | 27000 | 0.4721 | 0.4418 | 0.2639 |
| 0.2091 | 35.9435 | 28000 | 0.4782 | 0.4383 | 0.2615 |
| 0.2055 | 37.2272 | 29000 | 0.4915 | 0.4371 | 0.2602 |
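The WER and CER columns above are word and character error rates: the Levenshtein edit distance (substitutions + insertions + deletions) between the reference transcript and the model output, normalized by reference length. A minimal self-contained sketch of how they are computed (the `evaluate`/`jiwer` libraries implement the same idea):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences, via a rolling-row DP table."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution (free if equal)
        prev = cur
    return prev[-1]


def wer(reference, hypothesis):
    """Word error rate: edit distance over word sequences / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)


def cer(reference, hypothesis):
    """Character error rate: edit distance over characters / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, a hypothesis with one wrong word out of four has a WER of 0.25; note that WER and CER can exceed 1.0 when the hypothesis contains many insertions.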
Base model: facebook/w2v-bert-2.0