08 -> 09
README.md CHANGED
@@ -6,17 +6,17 @@ language:
 size_categories:
 - 1B<n<10B
 license: odc-by
-pretty_name: OLMoE Mix (
+pretty_name: OLMoE Mix (September 2024)
 ---
 
-# OLMoE Mix (
+# OLMoE Mix (September 2024)
 
 
 <img alt="OLMoE Mix Logo." src="olmoe-mix.png" width="250px">
 
-The following data mix was used to train OLMoE-1B-7B, a Mixture-of-Experts LLM with 1B active and 7B total parameters released in
+The following data mix was used to train OLMoE-1B-7B, a Mixture-of-Experts LLM with 1B active and 7B total parameters released in September 2024.
 
-The base version of OLMoE-1B-7B can be found at [this page](https://huggingface.co/OLMoE/OLMoE-1B-7B-
+The base version of OLMoE-1B-7B can be found at [this page](https://huggingface.co/OLMoE/OLMoE-1B-7B-0924), the SFT of OLMoE-1B-7B is available [here](https://huggingface.co/OLMoE/OLMoE-1B-7B-0924-SFT), and a version combining SFT and DPO is available following [this link](https://huggingface.co/OLMoE/OLMoE-1B-7B-0924-Instruct).
 
 ## Statistics
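For readers arriving at the card after this rename, here is a minimal sketch of loading the base checkpoint referenced in the updated README with `transformers`. The repo id is taken from the card's links and may differ from the actual hosting organization, and native OLMoE support in a recent `transformers` release is assumed.

```python
# Minimal loading sketch. Assumptions: the repo id matches the base-model link
# in the card above, and the installed transformers version ships OLMoE support.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "OLMoE/OLMoE-1B-7B-0924"  # base model link from this README; adjust if the org differs

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "The OLMoE Mix was used to pretrain"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```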