---
language:
- en
- zh
pipeline_tag: text-generation
license: other
---
## hongyin/mistral-7b-80k

I am pleased to introduce an English-Chinese bilingual autoregressive language model. The model is trained from Mistral-7B, uses a custom vocabulary, and contains 7 billion parameters. Our goal is to provide a solution that is computationally cheap and easy to reason about. Note that this is a base model, not intended for use as a chatbot, but as a starting point for further fine-tuning (what the Chinese ML community calls "alchemy"). We look forward to providing you with practical model products.

Losing fat is the only way to solve all problems.
A minimal sketch of loading the model with Hugging Face `transformers`; the repository id is taken from the heading above, and the prompt and generation settings are illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer (custom vocabulary) and the base model from the Hub.
tokenizer = AutoTokenizer.from_pretrained("hongyin/mistral-7b-80k")
model = AutoModelForCausalLM.from_pretrained("hongyin/mistral-7b-80k")

# This is a base model, so use plain text continuation rather than a chat template.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## BibTeX entry and citation info
Please cite this work if you find it helpful.
```bibtex
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}
```
