update README.md
README.md
CHANGED
<p align="center">
<a href="https://github.com/OpenBMB/MiniCPM/" target="_blank">GitHub Repo</a> |
<a href="https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf" target="_blank">Technical Report</a>
</p>
<p align="center">
👋 Join us on <a href="https://discord.gg/3cGQn9b3YM" target="_blank">Discord</a> and <a href="https://github.com/OpenBMB/MiniCPM/blob/main/assets/wechat.jpg" target="_blank">WeChat</a>
</p>

## What's New
- [2025.06.06] The **MiniCPM4** series is released! It achieves ultimate efficiency improvements while maintaining optimal performance at the same scale, delivering over 5x generation speedup on typical end-side chips. You can find the technical report [here](https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf). 🔥🔥🔥

## MiniCPM4 Series
The MiniCPM4 series comprises highly efficient large language models (LLMs) designed explicitly for end-side devices, achieving this efficiency through systematic innovation in four key dimensions: model architecture, training data, training algorithms, and inference systems.
- [BitCPM4-0.5B](https://huggingface.co/openbmb/BitCPM4-0.5B): Extreme ternary quantization applied to MiniCPM4-0.5B compresses model parameters into ternary values, achieving a 90% reduction in bit width.
- [BitCPM4-1B](https://huggingface.co/openbmb/BitCPM4-1B): Extreme ternary quantization applied to MiniCPM3-1B compresses model parameters into ternary values, achieving a 90% reduction in bit width. (**<-- you are here**)
- [MiniCPM4-Survey](https://huggingface.co/openbmb/MiniCPM4-Survey): Based on MiniCPM4-8B, accepts users' queries as input and autonomously generates trustworthy, long-form survey papers.
- [MiniCPM4-MCP](https://huggingface.co/openbmb/MiniCPM4-MCP): Based on MiniCPM4-8B, accepts users' queries and available MCP tools as input and autonomously calls relevant MCP tools to satisfy users' requirements.
## Introduction
BitCPM4 models are ternary-quantized models derived from the MiniCPM series through quantization-aware training (QAT), achieving significant improvements in both training efficiency and model parameter efficiency.
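
For intuition on the "90% reduction in bit width": a ternary weight carries log2(3) ≈ 1.58 bits of information versus 16 bits for an FP16 weight, and 1.58/16 is roughly a 90% reduction. The snippet below is a minimal illustrative sketch of ternary quantization using BitNet-style absmean rounding; it is not BitCPM4's actual QAT recipe (see the technical report for that), and the per-tensor scale is an assumption made for illustration.

```python
import torch

def ternarize(w: torch.Tensor, eps: float = 1e-8):
    """Illustrative ternary quantization: map a float weight tensor to
    codes in {-1, 0, +1} plus a single full-precision scale.
    Not BitCPM4's actual QAT procedure; purely for intuition."""
    scale = w.abs().mean().clamp(min=eps)  # per-tensor absmean scale (assumption)
    q = (w / scale).round().clamp(-1, 1)   # ternary codes in {-1, 0, +1}
    return q, scale

w = torch.randn(4, 4)            # stand-in for a weight matrix
q, scale = ternarize(w)
w_hat = q * scale                # dequantized approximation of w
print(q)                         # every entry is -1.0, 0.0, or +1.0
print((w - w_hat).abs().mean())  # mean quantization error
```

Since 3^5 = 243 ≤ 256, five ternary codes fit in one byte when packed for storage, which is how an effective width below 2 bits per weight is realized in practice.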

BitCPM4's performance is comparable with that of other full-precision models of the same size.

- When using content generated by MiniCPM, users should take full responsibility for evaluating and verifying it on their own.
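
Since the model card lists `transformers` as the library, here is a minimal generation sketch following the standard transformers chat flow. It is an assumption-laden sketch rather than official usage: the dtype, `trust_remote_code=True` (other MiniCPM releases ship custom modeling code), and the generation settings are placeholder choices; the model card and GitHub repo remain the authoritative reference.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/BitCPM4-1B"  # this repository's model

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # dtype is an assumption; check the model card
    device_map="auto",
    trust_remote_code=True,      # assumed, matching other MiniCPM releases
)

messages = [{"role": "user", "content": "Explain ternary quantization in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```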
## LICENSE
- This repository and MiniCPM models are released under the [Apache-2.0](https://github.com/OpenBMB/MiniCPM/blob/main/LICENSE) License.
## Citation
- Please cite our [paper](https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf) if you find our work valuable.
```bibtex
@article{minicpm4,
  title={{MiniCPM4}: Ultra-Efficient LLMs on End Devices},
  author={MiniCPM Team},
  year={2025}
}
```