---
license: apache-2.0
---
## This model exists only to obtain metrics from the `HuggingFaceH4/open_llm_leaderboard`

To evaluate the impact of increasing the number of experts used per token, the `num_experts_per_tok` setting in `config.json` is changed from 2 to 3. The aim is to determine whether this change alone leads to any notable improvement in the benchmark metrics.
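As a concrete illustration, the sketch below applies that edit to a local copy of the config file with Python's `json` module. The file path is an assumption about where the checkout lives; the original value of 2 is the upstream Mixtral-8x7B-v0.1 default.

```python
import json

# Assumed location of a local copy of the config from mistralai/Mixtral-8x7B-v0.1.
config_path = "config.json"

with open(config_path) as f:
    config = json.load(f)

assert config["num_experts_per_tok"] == 2  # upstream default: top-2 routing
config["num_experts_per_tok"] = 3          # route each token to 3 experts instead

with open(config_path, "w") as f:
    json.dump(config, f, indent=2)
```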

Note that the model weights are copied unchanged from https://huggingface.co/mistralai/Mixtral-8x7B-v0.1.
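For completeness, the following sketch shows one way to reproduce this repository with the `transformers` library: load the upstream checkpoint, override only the routing setting, and save the result. The output directory name is illustrative, and loading assumes enough memory for the full 8x7B checkpoint.

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

source = "mistralai/Mixtral-8x7B-v0.1"

# Take the upstream config and change only the number of experts per token.
config = AutoConfig.from_pretrained(source)
config.num_experts_per_tok = 3  # top-3 routing instead of the default top-2

# The weights themselves are loaded (and saved) unchanged.
model = AutoModelForCausalLM.from_pretrained(source, config=config)
tokenizer = AutoTokenizer.from_pretrained(source)

model.save_pretrained("./mixtral-8x7b-top3")      # illustrative output path
tokenizer.save_pretrained("./mixtral-8x7b-top3")
```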



![image/png](https://cdn-uploads.huggingface.co/production/uploads/643fb889b9ba82afb66d6b36/heAOiPKp5XSSh-drFQ74l.png)