Upload README.md

README.md (changed)
````diff
@@ -15,7 +15,7 @@ datasets:
 - camel-ai/chemistry
 - winglian/evals
 inference: false
-license:
+license: other
 model_creator: Open Access AI Collective
 model_name: Minotaur 13B Fixed
 model_type: llama
@@ -78,15 +78,8 @@ A chat between a curious user and an artificial intelligence assistant. The assi
 ```
 
 <!-- prompt-template end -->
-<!-- licensing start -->
-## Licensing
 
-The creator of the source model has listed its license as `apache-2.0`, and this quantization has therefore used that same license.
 
-As this model is based on Llama 2, it is also subject to the Meta Llama 2 license terms, and the license files for that are additionally included. It should therefore be considered as being claimed to be licensed under both licenses. I contacted Hugging Face for clarification on dual licensing but they do not yet have an official position. Should this change, or should Meta provide any feedback on this situation, I will update this section accordingly.
-
-In the meantime, any questions regarding licensing, and in particular how these two licenses might interact, should be directed to the original model repository: [OpenAccess AI Collective's Minotaur 13B Fixed](https://huggingface.co/openaccess-ai-collective/minotaur-13b-fixed).
-<!-- licensing end -->
 <!-- README_GPTQ.md-provided-files start -->
 ## Provided files and GPTQ parameters
 
````
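For reference, this is the model-card YAML front matter after the change, reconstructed only from the context lines visible in the first hunk; fields outside that hunk (the rest of the `datasets` list, any `model_link`, tags, etc.) are not shown because the diff does not include them:

```yaml
# Partial front matter sketch, assembled from the hunk's context lines.
datasets:
- camel-ai/chemistry
- winglian/evals
inference: false
license: other   # changed in this commit (previous value not shown in the diff)
model_creator: Open Access AI Collective
model_name: Minotaur 13B Fixed
model_type: llama
```

Setting `license: other` is the Hugging Face Hub convention for models whose licensing cannot be expressed as a single SPDX identifier, which matches the dual-license (Apache-2.0 plus Meta Llama 2 terms) situation described in the removed Licensing section.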