Update README.md
README.md
CHANGED
@@ -28,7 +28,7 @@ library_name: transformers
 
   <a href="https://www.modelscope.cn/organization/XiaomiMiMo" target="_blank">🤖️ ModelScope</a>
 
-  <a href="https://
+  <a href="https://arxiv.org/abs/2505.07608" target="_blank">📔 Technical Report</a>
 
   <br/>
 </div>
@@ -131,7 +131,7 @@ Example Script
 python3 -m uv pip install "sglang[all] @ git+https://github.com/sgl-project/sglang.git/@main#egg=sglang&subdirectory=python"
 
 # Launch SGLang Server
-python3 -m sglang.launch_server --model-path XiaomiMiMo/MiMo-7B-
+python3 -m sglang.launch_server --model-path XiaomiMiMo/MiMo-7B-Base --host 0.0.0.0 --trust-remote-code
 ```
 
 Detailed usage can be found in [SGLang documents](https://docs.sglang.ai/backend/send_request.html). MTP will also be supported in 24h.
@@ -221,16 +221,18 @@ print(tokenizer.decode(output.tolist()[0]))
 ## V. Citation
 
 ```bibtex
-@misc{
-      title={MiMo: Unlocking the Reasoning Potential of Language Model
+@misc{coreteam2025mimounlockingreasoningpotential,
+      title={MiMo: Unlocking the Reasoning Potential of Language Model -- From Pretraining to Posttraining},
       author={{Xiaomi LLM-Core Team}},
       year={2025},
+      eprint={2505.07608},
+      archivePrefix={arXiv},
       primaryClass={cs.CL},
-      url={https://
+      url={https://arxiv.org/abs/2505.07608},
 }
 ```
 
 
 ## VI. Contact
 
-Please contact us at [[email protected]](mailto:[email protected]) or open an issue if you have any questions.
+Please contact us at [[email protected]](mailto:[email protected]) or open an issue if you have any questions.