---
language:
  - en
  - zh
license: other
pipeline_tag: text-generation
---

# hongyin/informer-7b-80k

I am pleased to introduce an English-Chinese bilingual autoregressive language model. The model is trained from Llama2-7b, with its own vocabulary and 7 billion parameters. Our goal is to provide a solution that is computationally cheap and easy to reason about. Please note that this is a base model: it is not intended for use as a chatbot, but as a starting point for further fine-tuning ("alchemy"). We look forward to providing you with practical model products.
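Since this is a plain text-generation model, a minimal loading and completion sketch may help. This assumes the checkpoint follows the standard Hugging Face transformers causal-LM interface; the model id is taken from this card, and the prompt and generation settings are illustrative only.

```python
# Minimal usage sketch; exact loading arguments may differ for this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hongyin/informer-7b-80k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",          # place layers on available devices (requires accelerate)
)

# Base (non-chat) model: plain text completion, not instruction following.
prompt = "The history of machine translation begins with"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```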

High-sounding words aside, the name of each model, including its predecessors, carries rich connotations and personal experience, and is worth revisiting.


## Bibtex entry and citation info

Please cite this work if you find it helpful.

```bibtex
@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}
```
