Update README.md
README.md CHANGED
@@ -241,7 +241,7 @@ And the learning rate curve:
 ```
 xtuner convert pth_to_hf ./finetune.py ./work_dirs/iter_xxx.pth ./my_lora_and_projector
 ```
-The adapter still need to be used with the internlm/internlm2-chat-
+The adapter still needs to be used with the internlm/internlm2-chat-1_8b and the vision encoder. I have not tried to merge them yet, but it is possible with Xtuner; see this [tutorial](https://github.com/InternLM/xtuner/blob/f63859b3d0cb39cbac709e3850f3fe01de1023aa/xtuner/configs/llava/README.md#L4).
 
 ## MMBench Evaluation
 
 You can first download the MMBench data:
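Regarding the line added above about using the adapter together with internlm/internlm2-chat-1_8b and the vision encoder: the sketch below shows one way this is typically done in the XTuner LLaVA workflow described in the linked tutorial. The visual-encoder checkpoint (`openai/clip-vit-large-patch14-336`), the prompt template, and the image path are assumptions, not taken from this repository, and flag names may differ across XTuner versions.

```
# Illustrative only: chat with the un-merged LoRA adapter + projector alongside the base LLM.
# Assumes the XTuner LLaVA chat interface; the encoder, template, and image path are placeholders.
xtuner chat internlm/internlm2-chat-1_8b \
  --visual-encoder openai/clip-vit-large-patch14-336 \
  --llava ./my_lora_and_projector \
  --prompt-template internlm2_chat \
  --image ./example.jpg
```

The linked tutorial also covers merging the adapter into the base model so it no longer has to be passed separately at chat time; that step is not attempted here.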