This Model
This is a chat model obtained by continued pretraining and subsequent supervised fine-tuning (SFT) on top of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T.
How to use
You will need transformers>=4.34. Check the TinyLlama GitHub page for more information.
# Install transformers from source - only needed for versions <= v4.34
# pip install git+https://github.com/huggingface/transformers.git
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="lab-ii/TinyLlama-Sakha-Instruct", torch_dtype=torch.bfloat16, device_map="auto")
prompt_input = (
"Below is an instruction that describes a task. "
"Write a response that appropriately completes the request.\n\n"
"### Instruction:\n\n{instruction}\n\n### Response:\n\n"
)
raw_input_text = "Доруобай буолар кына үс сүбэни биэр"  # Sakha, roughly: "Give three tips for staying healthy"
prompt = prompt_input.format(instruction=raw_input_text)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
Доруобай буолар кына үс сүбэни биэр
### Response:
1. Аһылыккын тутус уонна элбэх фруктаны уонна хортуоппуйу сиэ.
2. Этиҥ-сииниҥ көхтөөх уонна күүстээх буоларын туһугар өрүү дьарыктан.
3. Ситэри утуй уонна биир тэҥ утуйар графигы тутус.
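If you prefer not to use the pipeline helper, the same generation can be run by loading the tokenizer and model directly. The snippet below is a minimal sketch assuming only the standard transformers AutoTokenizer / AutoModelForCausalLM API; it reuses the Alpaca-style template and mirrors the sampling parameters of the pipeline call above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lab-ii/TinyLlama-Sakha-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

# Same Alpaca-style template as in the pipeline example above.
prompt_input = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n\n{instruction}\n\n### Response:\n\n"
)
prompt = prompt_input.format(instruction="Доруобай буолар кына үс сүбэни биэр")

# Tokenize, generate with the same sampling settings, and decode the result.
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))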