# Yoruba W2v-BERT 2.0 Models
This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on the NAIJAVOICES_YORUBA_100H - NA dataset. Its results on the evaluation set are reported per checkpoint in the training results table below.
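A minimal usage sketch is shown below, assuming the fine-tuned checkpoint is published on the Hugging Face Hub with a standard `transformers` CTC head and processor; the repository id and audio path are placeholders, not values taken from this card.

```python
# Inference sketch (assumptions: checkpoint available on the Hub; replace MODEL_ID
# with the actual repository id of this fine-tuned model).
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

MODEL_ID = "your-username/w2v-bert-2.0-yoruba"  # hypothetical placeholder id

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = Wav2Vec2BertForCTC.from_pretrained(MODEL_ID)
model.eval()

# w2v-BERT 2.0 operates on 16 kHz mono audio.
speech, _ = librosa.load("yoruba_sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```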
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
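The exact values are not preserved in this card. Purely as a hypothetical illustration of how such a configuration is usually expressed with `transformers.TrainingArguments`, the sketch below uses placeholder values; only the 1,000-step evaluation interval and the roughly 48-epoch run length are taken from the results table, everything else is assumed.

```python
# Hypothetical configuration sketch: every value marked "placeholder" is illustrative,
# not the setting actually used to train this model.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-yoruba",   # placeholder
    per_device_train_batch_size=16,     # placeholder
    learning_rate=5e-5,                 # placeholder
    warmup_steps=500,                   # placeholder
    num_train_epochs=50,                # placeholder (the results table spans ~48 epochs)
    eval_strategy="steps",
    eval_steps=1000,                    # matches the 1000-step eval interval in the table
    logging_steps=1000,
    save_steps=1000,
    fp16=True,                          # placeholder
)
```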
### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 0.8442        | 3.2051  | 1000  | 0.8847          | 0.6790 | 0.3471 |
| 0.6917        | 6.4103  | 2000  | 0.7064          | 0.6078 | 0.3204 |
| 0.7078        | 9.6154  | 3000  | 0.6668          | 0.5840 | 0.3096 |
| 0.6132        | 12.8205 | 4000  | 0.6308          | 0.5640 | 0.3052 |
| 0.577         | 16.0256 | 5000  | 0.6232          | 0.5551 | 0.3010 |
| 0.5954        | 19.2308 | 6000  | 0.6014          | 0.5390 | 0.2942 |
| 0.5286        | 22.4359 | 7000  | 0.6020          | 0.5449 | 0.2986 |
| 0.7866        | 25.6410 | 8000  | 0.5781          | 0.5326 | 0.2904 |
| 0.4878        | 28.8462 | 9000  | 0.5705          | 0.5100 | 0.2935 |
| 0.4091        | 32.0513 | 10000 | 0.5647          | 0.5218 | 0.2857 |
| 0.3913        | 35.2564 | 11000 | 0.5915          | 0.5248 | 0.2921 |
| 0.3878        | 38.4615 | 12000 | 0.6052          | 0.5296 | 0.2998 |
| 0.364         | 41.6667 | 13000 | 0.6151          | 0.4967 | 0.2802 |
| 0.3021        | 44.8718 | 14000 | 0.6075          | 0.4997 | 0.2861 |
| 0.2587        | 48.0769 | 15000 | 0.6518          | 0.5329 | 0.3016 |
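The Wer and Cer columns are word and character error rates; a small sketch of how they are typically computed with the Hugging Face `evaluate` library is shown below (the reference and prediction strings are toy examples, not data from this model).

```python
# Word/character error rate sketch using the `evaluate` library (requires jiwer).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["example reference transcript"]   # toy ground-truth text
predictions = ["example referense transcript"]  # toy model output with one error

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```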