Yosef Worku Alemneh (rasyosef)
BERT transformer encoder models pretrained on 290 million tokens of Amharic text

AI & ML interests: Pretraining, Supervised Fine-Tuning, Direct Preference Optimization, Retrieval-Augmented Generation (RAG), Function Calling
Organizations: None yet
Collections: 5 • Spaces: 9 • Models: 44
rasyosef/phi-2-apo-down • Updated
rasyosef/phi-2-instruct-apo • Text Generation • Updated • 51
rasyosef/phi-2-apo • Updated • 71
rasyosef/phi-1_5-sft-v4-merged • Text Generation • Updated • 20
rasyosef/phi-2-instruct-v0.2 • Text Generation • Updated • 79
rasyosef/phi-2-dpo-v3 • Updated
rasyosef/Mistral-NeMo-Minitron-8B-Chat • Text Generation • Updated • 244 • 6
rasyosef/Llama-3.1-Minitron-4B-Chat • Text Generation • Updated • 163 • 3
rasyosef/SmolLM-1.7B-sft-v2 • Updated
rasyosef/phi-2-instruct-v0.1 • Text Generation • Updated • 117
Datasets: 6

rasyosef/ultrafeedback-orca-math-dpo • Viewer • Updated • 43.7k • 19
rasyosef/OpenHermes-SLM-384k • Viewer • Updated • 384k • 2
rasyosef/amharic-sentences-corpus • Viewer • Updated • 6.44M • 7 • 1
rasyosef/amharic-named-entity-recognition • Viewer • Updated • 3.47k • 2
rasyosef/amharic-sentiment • Viewer • Updated • 2.78k • 4
rasyosef/amharic-news-category-classification • Viewer • Updated • 50k • 15