---
license: llama3
language:
- en
library_name: transformers
tags:
- text-generation
---
# Mixllama3-8x8b-Instruct-v0.1 based on LLaMA 3

Mixllama3-8x8b-Instruct-v0.1 is an experimental Mixture-of-Experts (MoE) model built from LLaMA-3-8B.
It combines eight fine-tuned LLaMA 3 8B models, each specialized in a different set of tasks.
By routing prompts to the most relevant experts, Mixllama3-8x8b aims to deliver stronger performance and adaptability across a wide range of applications.


![image/gif](https://cdn-uploads.huggingface.co/production/uploads/64414d01bd0c97265297acc5/OQ-cZNYe_2r1JK4Z6fCgg.gif)

## Disclaimer

This model is a research experiment and may generate incorrect or harmful content. The model's outputs should not be taken as factual or representative of the views of the model's creator or any other individual.

The model's creator is not responsible for any harm or damage caused by the model's outputs.

## Merge Details

```yaml
base_model: meta-llama/Meta-Llama-3-8B-Instruct
experts:
  - source_model: meta-llama/Meta-Llama-3-8B-Instruct
    positive_prompts:
    - "assistant"
  - source_model: Muhammad2003/Llama3-8B-OpenHermes-DPO
    positive_prompts:
    - "python"
  - source_model: cognitivecomputations/dolphin-2.9-llama3-8b
    positive_prompts:
    - "chat"
  - source_model: orpo-explorers/hf-llama3-8b-orpo-v0.1.4
    positive_prompts:
    - "code"
  - source_model: Locutusque/llama-3-neural-chat-v1-8b
    positive_prompts:
    - "math"
  - source_model: mlabonne/Llama-3-SLERP-8B
    positive_prompts:
    - "AI"
  - source_model: meta-llama/Meta-Llama-3-8B
    positive_prompts:
    - "explain"
  - source_model: dreamgen/opus-v1.2-llama-3-8b
    positive_prompts:
    - "Role playing"
gate_mode: cheap_embed
dtype: float16
```
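As a rough, self-contained sketch of the idea behind the `cheap_embed` gate mode above (not mergekit's actual implementation): each expert is associated with an embedding of its positive prompts, each token embedding is scored against those expert embeddings by cosine similarity, and the hidden state is routed to the top-k experts with softmax-normalized weights. The dimensions and the `route_topk` helper here are illustrative assumptions.

```python
import numpy as np

def route_topk(token_emb, expert_embs, k=2):
    """Pick the k experts whose prompt embeddings best match a token embedding."""
    # Cosine similarity between the token embedding and each expert's
    # positive-prompt embedding; higher score => stronger affinity.
    scores = expert_embs @ token_emb
    scores = scores / (np.linalg.norm(expert_embs, axis=1)
                       * np.linalg.norm(token_emb) + 1e-9)
    top = np.argsort(scores)[-k:][::-1]   # indices of the top-k experts, best first
    weights = np.exp(scores[top])
    weights /= weights.sum()              # softmax over the selected experts only
    return top, weights

rng = np.random.default_rng(0)
experts = rng.normal(size=(8, 16))        # 8 experts, toy 16-dim embeddings
token = rng.normal(size=16)               # one token embedding
idx, w = route_topk(token, experts)
```

In the real model the per-expert weights would then mix the outputs of the selected experts' MLP layers; `cheap_embed` is "cheap" because the gate is derived from prompt embeddings rather than trained end-to-end.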

**Meta Llama 3 is licensed under the Meta Llama 3 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.**