Update README.md
README.md
CHANGED
@@ -18,7 +18,9 @@ base_model:
## Introduction
-Ring-lite is a fully open-source MoE LLM provided by InclusionAI, which has 16.8B parameters with 2.75B activated parameters.
+Ring-lite is a fully open-source MoE LLM provided by InclusionAI, with 16.8B total parameters and 2.75B activated parameters. It builds upon the publicly available [Ling-lite-1.5](https://huggingface.co/inclusionAI/Ling-lite-1.5) model. Ring-lite achieves performance comparable to state-of-the-art (SOTA) small-size reasoning models on challenging benchmarks (AIME, LiveCodeBench, and GPQA-Diamond) while activating only one-third of their parameters. It is trained with a joint pipeline that combines knowledge distillation with reinforcement learning (RL).
+
+
## Model Downloads
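For anyone trying the model this update describes, here is a minimal loading sketch, not an official snippet from the README: it assumes the checkpoint exposes the standard Hugging Face `transformers` causal-LM interface with `trust_remote_code`, and the repo id `inclusionAI/Ring-lite` is a placeholder (the only repo linked above is [Ling-lite-1.5](https://huggingface.co/inclusionAI/Ling-lite-1.5)); check the model card for the actual identifier.

```python
# Minimal sketch (assumptions: standard transformers causal-LM interface,
# trust_remote_code for the custom MoE architecture, placeholder repo id).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inclusionAI/Ring-lite"  # placeholder; confirm the real repo id on the model card

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # let accelerate place layers/experts across available devices
    trust_remote_code=True,
)

prompt = "Briefly explain mixture-of-experts activation."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that only the ~2.75B activated parameters run per token, but all 16.8B must still fit in memory, which is why `device_map="auto"` is used above.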