Updated README
README.md CHANGED
@@ -1,3 +1,19 @@
+---
+language:
+- en
+pipeline_tag: text-generation
+tags:
+- OpenVINO
+- meta
+- llama
+- llama-3
+- PyTorch
+license: llama3
+extra_gated_prompt: |
+
+  Meta Llama 3 Version Release Date: April 18, 2024
+library_name: transformers
+---
 # Llama-3-8B-Instruct-ov-fp16-int4-sym
 
 ## Built with Meta Llama 3
@@ -34,4 +50,4 @@ model = OVModelForCausalLM.from_pretrained(model_id)
 
 pipeline = transformers.pipeline("text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto")
 pipeline("Hey how are you doing today?")
-```
+```
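The usage shown in the diff loads the OpenVINO-exported model with `OVModelForCausalLM` and then runs generation through a `transformers` pipeline. A minimal self-contained sketch of that flow is below; it assumes `optimum-intel` is installed with OpenVINO extras (`pip install optimum[openvino]`), and the repo id is a placeholder, not taken from the diff. Note that the `torch_dtype=torch.bfloat16` keyword in the diff applies to the PyTorch execution path; the OpenVINO model ignores it, so it is dropped here.

```python
# Hedged sketch of the README's usage, assuming optimum-intel with OpenVINO
# extras is installed. The repo id passed in is a placeholder.

def generate(model_id: str, prompt: str) -> str:
    """Load an OpenVINO-exported causal LM and return one completion."""
    # Imports are local so defining this function needs no heavy dependencies.
    from optimum.intel import OVModelForCausalLM  # OpenVINO backend for transformers
    from transformers import AutoTokenizer, pipeline

    model = OVModelForCausalLM.from_pretrained(model_id)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
    return pipe(prompt)[0]["generated_text"]

# Example call (downloads the model weights, so it is not executed here):
# print(generate("<repo-id>", "Hey how are you doing today?"))
```

Passing the loaded `model` object to `pipeline` (rather than a repo-id string, as the diff does) keeps inference on the OpenVINO runtime instead of re-loading the PyTorch weights.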