Llama-3.2-400M-Amharic-Instruct

Part of the Llama 3.2 Amharic collection: 12 Llama 3.2 decoder transformer models trained on Amharic text.
This model is an instruction-tuned version of Llama 3.2 400M Amharic.
Given the nature of the training data, this instruct model is best suited for prompts that use the chat format shown below. You can provide the prompt as a question with the following generic template:
<|im_start|>user
ጥያቄ?<|im_end|>
<|im_start|>assistant
For example:
<|im_start|>user
ሶስት የአፍሪካ ሀገራት ጥቀስልኝ<|im_end|>
<|im_start|>assistant
where the model generates the text after <|im_start|>assistant.
First, you need to install the latest version of transformers:
pip install -Uq transformers
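The examples below load the model with device_map="auto", which relies on the Accelerate library; if it is not already installed, you can add it in the same way:
pip install -Uq accelerate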
You can use this model directly with a pipeline for text generation:
from transformers import pipeline
llama3_am = pipeline(
    "text-generation",
    model="rasyosef/Llama-3.2-400M-Amharic-Instruct",
    device_map="auto"
)
messages = [{"role": "user", "content": "ሰላም"}]
llama3_am(messages, max_new_tokens=128, repetition_penalty=1.1, return_full_text=False)
Output:
[{'generated_text': 'ሰላም! ዛሬ እንዴት ልረዳዎት እችላለሁ? 😊'}]
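Generation settings such as max_new_tokens and repetition_penalty are forwarded to the underlying generate call, so you can also enable sampling for more varied answers. The values below are only illustrative choices, not recommendations from the model authors:
llama3_am(
    messages,
    max_new_tokens=128,
    repetition_penalty=1.1,
    do_sample=True,       # sample instead of greedy decoding
    temperature=0.7,      # illustrative value
    return_full_text=False
)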
messages = [{"role": "user", "content": "ስለ ጅብና አንበሳ ተረት ንገረኝ"}]
llama3_am(messages, max_new_tokens=128, repetition_penalty=1.1, return_full_text=False)
Output:
[{'generated_text': 'α α₯α«α± α¦αͺα α¨α°α°α¨α¨ αα α α
α¦α½α α αα΅ α αα α³ α α°α αα₯α°α α α΅α ααα½α α α¬αα½α α α΅αα α«αα‘α‘ α αα α³αα βα₯α αα αα αα΅α α₯ααα° α αα±α α΅αα΅α³αα½αα‘α‘β α²ααΈα α
α¦αΉα βαα
α₯αα΄α΅ αααα? α₯α αα α ααα α³α α αα° αα αα αα΅α°α
αα₯α α αα΅ αα΅α α΅α
αααα ααα?β α αα΅α‘α‘ α αα α³αα βα α α₯α α αα΅ ααα‘α‘ αα αα α΅αα΅α΅ α α΅α α₯αααααα‘α‘ α₯ααα°α αα α αα½αα‘α‘ α αα·α αα α΅α΅α¨αα© α α΅α α΅αααα½αα‘α‘β α ααΈαα‘α‘ α
α¦αΉα α¨α αα α³αα α α΅ααͺ α ααα½ α α°ααα¨α± αα α΅ααα©α΅ αα° α α£α³αΈα α ααα΅ α αα α³α αα₯ααα± α αα΅ α°α₯α·αΈα αα«α± αα α α₯αα°αα°α°α ααΈαα α αα α³α α₯α αα α'}]
messages = [{"role": "user", "content": "α΅α αα
αα³ αα₯α α»ααα"}]
llama3_am(messages, max_new_tokens=128, repetition_penalty=1.1, return_full_text=False)
Output:
[{'generated_text': 'αα
αα³ αα
αα³\nαα
αα³ α₯αα΄\nα α°α£α£α ααα αααα©α΅ αααα΄\nα³αα½ αα΅α½ α ααα α΅αα ααα½ α ααα\nα αα₯α© α₯αα»αα α¨αα₯α½ α°ααα\nαα
αα³ α α£α΄\nα α°α£α£α ααα αααα©α΅ α΅α΅α΄\nα¨αα
αα΅ αα αααα ααα΅α
\nα¨αα΄ ααα αα αα΅ ααα΅α
\nαα
αα³ ααα΅α\nα α°α£α£α ααα αααα©α΅ α
αα\nα¨α α» αααα³α½α ααα
α©α΅ αα£α΅α
\nα΅α α ααα΅ α¨α₯αΆα
α΅α₯α αααα³α΅α
\nαα
αα³ α₯α
α΄\nα α°α£α£α ααα αααα©α΅ α½ααα΄\nα α₯α¬α α α₯α¬α α΅α αα£αα©α΅ α α α£α½\nα α³α'}]
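You can also load the tokenizer and model directly instead of using the pipeline. The sketch below assumes the tokenizer ships a chat template that produces the <|im_start|> format shown above:
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "rasyosef/Llama-3.2-400M-Amharic-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build the <|im_start|>user ... <|im_start|>assistant prompt from the chat messages
messages = [{"role": "user", "content": "ሰላም"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Decode only the newly generated tokens, dropping the prompt
output_ids = model.generate(input_ids, max_new_tokens=128, repetition_penalty=1.1)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))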
Base model: rasyosef/Llama-3.2-400M-Amharic