Safetensors · Romanian · llama
Vlad-Andrei Badoiu committed · Commit 6ddbdf3 · 1 Parent(s): b58e088

Add how to instructions for transformers

Files changed (1)
  1. README.md +27 -0
README.md CHANGED
@@ -34,6 +34,33 @@ parameters dense decoder-only Transformer model based on Llama2.

 Our model is designed to accelerate research on Romanian language models, serving as a building block for generative AI applications.

+ ## Use with transformers
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
+
+ device = "cuda"
+ model_id = "faur-ai/LLMic"
+ prompt = "Capitala României este"
+
+ model = AutoModelForCausalLM.from_pretrained(model_id).to(device)
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ streamer = TextStreamer(tokenizer)
+
+ inputs = tokenizer.encode(
+     prompt,
+     add_special_tokens=False,
+     return_tensors='pt',
+ ).to(device)
+
+ outputs = model.generate(
+     streamer=streamer,
+     input_ids=inputs,
+     temperature=0.8,
+     do_sample=True
+ )
+ ```
+
 ## Data Overview

 ### Training Datasets
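
As a usage note beyond what the diff adds: the snippet above streams tokens to stdout through `TextStreamer`. A minimal self-contained sketch of collecting the generated text as a single string instead is shown below, using the same `faur-ai/LLMic` checkpoint and example prompt from the diff; `max_new_tokens=64` is an arbitrary illustrative value, not something set by this commit.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "cuda"
model_id = "faur-ai/LLMic"

# Load the checkpoint and tokenizer exactly as in the snippet above.
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_id)

inputs = tokenizer.encode("Capitala României este", return_tensors="pt").to(device)

# No streamer here; generate the ids first, then decode them in one go.
# max_new_tokens=64 is an assumption for illustration, not from the commit.
outputs = model.generate(
    input_ids=inputs,
    max_new_tokens=64,
    temperature=0.8,
    do_sample=True,
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```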