Commit 05067c3 (parent: ec40b36): Update README.md

This commit removes the MPT-30B-Instruct demo link and completes two truncated references to the MPT-30B blog post: the link in the Documentation section and the title in the citation block.

README.md CHANGED
@@ -43,7 +43,6 @@ The following models are finetuned on MPT-30B:
 * [MPT-30B-Instruct](https://huggingface.co/mosaicml/mpt-30b-instruct): a model for short-form instruction following.
 Built by finetuning MPT-30B on several carefully curated datasets.
 * License: _CC-By-NC-SA-3.0_
-* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-30b-instruct)
 
 * [MPT-30B-Chat](https://huggingface.co/mosaicml/mpt-30b-chat): a chatbot-like model for dialogue generation.
 Built by finetuning MPT-30B on [ShareGPT-Vicuna](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered), [Camel-AI](https://huggingface.co/camel-ai),
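For context around the model links in the hunk above (this code is not part of the commit), a minimal sketch of loading one of the finetuned checkpoints with Hugging Face transformers might look as follows; the prompt and generation settings are illustrative, and MPT requires trust_remote_code=True because the repo ships custom modeling code:

```python
# Minimal sketch (not part of this commit): loading MPT-30B-Instruct.
import transformers

name = "mosaicml/mpt-30b-instruct"
tokenizer = transformers.AutoTokenizer.from_pretrained(name)
model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    trust_remote_code=True,  # MPT uses custom modeling code from the repo
)

# Illustrative prompt; see the model card for the exact instruct format.
inputs = tokenizer("Explain finetuning in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In practice a 30B-parameter checkpoint would also need a device_map or reduced-precision settings; those are omitted here for brevity.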
@@ -61,7 +60,7 @@ Apache-2.0
 
 ## Documentation
 
-* [Blog post: MPT-30B: Raising the bar for open-source
+* [Blog post: MPT-30B: Raising the bar for open-source foundation models](https://www.mosaicml.com/blog/mpt-30b)
 * [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
 * Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!
 
@@ -236,8 +235,8 @@ Please cite this model using the following format:
 ```
 @online{MosaicML2023Introducing,
     author  = {MosaicML NLP Team},
-    title   = {Introducing MPT-30B:
-
+    title   = {Introducing MPT-30B: Raising the bar
+for open-source foundation models},
     year    = {2023},
     url     = {www.mosaicml.com/blog/mpt-30b},
     note    = {Accessed: 2023-06-22},
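One note on the citation block above: @online is a biblatex entry type, so plain BibTeX styles generally will not accept it. A minimal usage sketch, assuming the entry is saved in a hypothetical refs.bib:

```latex
% Minimal sketch: citing the @online entry above with biblatex + biber.
\documentclass{article}
\usepackage[backend=biber]{biblatex}
\addbibresource{refs.bib} % hypothetical file containing the entry
\begin{document}
MPT-30B was introduced by the MosaicML NLP Team~\cite{MosaicML2023Introducing}.
\printbibliography
\end{document}
```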