---
language:
- en
- zh
pipeline_tag: text-generation
license: other
---
## hongyin/mistral-7b-80k
I am pleased to introduce an English-Chinese bilingual autoregressive language model. The model is trained from Mistral-7B, with a custom vocabulary and 7 billion parameters. Our goal is to provide a model that is computationally cheap and easy to run. Note that this is a base model: it is not intended for use as a chatbot, but as a starting point for further fine-tuning ("alchemy"). We look forward to providing you with practical model products.
Losing fat is the only way to solve all problems.
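The snippet below is a minimal sketch for loading the model and generating a plain-text continuation, assuming the checkpoint is compatible with the standard Hugging Face `transformers` API; the prompt and generation settings are illustrative, not the author's recommendation.

```python
# Minimal usage sketch (assumption: standard transformers AutoModel API works
# for this checkpoint; prompt and max_new_tokens are illustrative choices).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "hongyin/mistral-7b-80k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# This is a base model, not a chat model, so we prompt with plain text
# and let it continue the sequence.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```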
## BibTeX entry and citation info
Please cite this work if you find it helpful.
```
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}
```