Update README.md
README.md CHANGED
@@ -99,10 +99,10 @@ if __name__ == '__main__':
 </center>
 ## Evaluation
 
-We evaluate the multimodal performance across a variety of datasets: VizWiz
-, MM-Vet
+We evaluate the multimodal performance across a variety of datasets: **VizWiz**, **SQA<sup>I</sup>**, **VQA<sup>T</sup>**, **POPE**, **GQA**, **MMB**, **MMB<sup>CN</sup>**
+, **MM-Vet**, and **MME**. Our analysis, as depicted in Table~\ref{tab:compare-with-sotas-vlms}.
 
-| Method | LLM | Res. | VizWiz | SQA | VQA | POPE | GQA | MMB | MMB
+| Method | LLM | Res. | VizWiz | SQA | VQA | POPE | GQA | MMB | MMB<sup>CN</sup> | MM-Vet | MME |
 |:--------------:|:----------------:|:----:|:------:|:----:|:----:|:----:|:----:|:----:|:--------:|:------:|:------:|
 | Openflamingo | MPT-7B | 336 | - | - | 33.6 | - | - | 4.6 | - | - | - |
 | BLIP-2 | Vicuna-13B | 224 | - | 61.0 | 42.5 | 85.3 | 41.0 | - | - | - | 1293.8 |
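The change extends the table's header row so it declares the same number of cells as the separator and data rows, which is what lets Markdown renderers draw the table correctly. A quick way to catch this class of bug before committing is to count the cells in every row; the helper below is a hypothetical sketch (`column_counts` and `TABLE` are not part of the repo) using the table from this commit.

```python
# Hypothetical sanity check (not part of the repo): verify that every row of
# a Markdown table declares the same number of cells, catching mismatches
# such as a header row shorter than its separator and data rows.

def column_counts(table: str) -> list[int]:
    """Return the number of cells in each pipe-delimited row of a Markdown table."""
    counts = []
    for line in table.strip().splitlines():
        line = line.strip()
        if line.startswith("|"):
            # Drop the leading/trailing pipes, then count the cells between them.
            counts.append(len(line.strip("|").split("|")))
    return counts

# The evaluation table as it appears after this commit (12 columns per row).
TABLE = """
| Method | LLM | Res. | VizWiz | SQA | VQA | POPE | GQA | MMB | MMB<sup>CN</sup> | MM-Vet | MME |
|:--------------:|:----------------:|:----:|:------:|:----:|:----:|:----:|:----:|:----:|:--------:|:------:|:------:|
| Openflamingo | MPT-7B | 336 | - | - | 33.6 | - | - | 4.6 | - | - | - |
| BLIP-2 | Vicuna-13B | 224 | - | 61.0 | 42.5 | 85.3 | 41.0 | - | - | - | 1293.8 |
"""

if __name__ == "__main__":
    counts = column_counts(TABLE)
    assert len(set(counts)) == 1, f"inconsistent column counts: {counts}"
    print(f"OK: {len(counts)} rows x {counts[0]} columns")  # → OK: 4 rows x 12 columns
```

Note the sketch only counts cells; it does not parse escaped pipes or pipes inside inline code, which a full Markdown parser would handle.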