Update README.md
@@ -44,11 +44,44 @@ tokenizer = RobertaTokenizer.from_pretrained('dsfsi/PuoBERTa')

### Downstream Use

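The pretrained checkpoint can be exercised directly as a masked language model with the `dsfsi/PuoBERTa` model id from the snippet above. A minimal sketch; the Setswana example sentence and `top_k` value are illustrative assumptions, not taken from the model card:

```python
# Minimal fill-mask sketch for PuoBERTa; the example sentence is an
# illustrative assumption, not taken from the model card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dsfsi/PuoBERTa")

# RoBERTa-style checkpoints use the <mask> token.
predictions = fill_mask("Ke rata go <mask>.", top_k=5)
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```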

## Downstream Performance

### MasakhaPOS

Performance of models on the MasakhaPOS downstream task.

| Model | Test Performance |
|---|---|
| **Multilingual Models** | |
| AfroLM | 83.8 |
| AfriBERTa | 82.5 |
| AfroXLMR-base | 82.7 |
| AfroXLMR-large | 83.0 |
| **Monolingual Models** | |
| NCHLT TSN RoBERTa | 82.3 |
| PuoBERTa | **83.4** |
| PuoBERTa+JW300 | 84.1 |

### MasakhaNER

Performance of models on the MasakhaNER downstream task.

| Model | Test Performance (F1 score) |
|---|---|
| **Multilingual Models** | |
| AfriBERTa | 83.2 |
| AfroXLMR-base | 87.7 |
| AfroXLMR-large | **89.4** |
| **Monolingual Models** | |
| NCHLT TSN RoBERTa | 74.2 |
| PuoBERTa | **78.2** |
| PuoBERTa+JW300 | 80.2 |
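For sequence-labelling tasks like the MasakhaNER results above, the checkpoint can be loaded with a fresh token-classification head before fine-tuning. A sketch; the label list below is an illustrative assumption, not necessarily the exact tag inventory used in these experiments:

```python
# Sketch: attach a token-classification head to PuoBERTa for NER-style
# fine-tuning. The label set is an illustrative assumption.
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]
model = AutoModelForTokenClassification.from_pretrained(
    "dsfsi/PuoBERTa",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
)
tokenizer = AutoTokenizer.from_pretrained("dsfsi/PuoBERTa")
```

The head is randomly initialized, so the model must be fine-tuned on labelled data before its predictions are meaningful.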

## Dataset

We used the PuoData dataset, a rich source of Setswana text, ensuring that our model is well-trained and culturally attuned.

## Citation Information

BibTeX reference