
Hebrew Models
Open-source models pretrained in Hebrew. A minimal loading sketch follows the list.
yam-peleg/Hebrew-Mistral-7B
Note: Current state-of-the-art base model trained in Hebrew. Continuously pre-trained from Mistral-7B, with the vocabulary extended by an additional 32,000 Hebrew tokens.
yam-peleg/Hebrew-Mistral-7B-200K
Note: Hebrew-Mistral-7B continuously pre-trained with a 200K context window.
yam-peleg/Hebrew-Mixtral-8x22B
Note: The largest Hebrew base model at the moment. Important note: under-trained compared to the others.
yam-peleg/Hebrew-Gemma-11B
Note: Previous state-of-the-art base model trained in Hebrew. Continuously pre-trained from Gemma-7B and extended to 11B parameters.
yam-peleg/Hebrew-Gemma-11B-V2
Note: V2 of the previous state-of-the-art base model trained in Hebrew. Continuously pre-trained from Gemma-7B and extended to 11B parameters.
yam-peleg/Hebrew-Gemma-11B-Instruct
Note: Instruct fine-tune of Hebrew-Gemma-11B, quickly trained for demonstration purposes.
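
As a quick start, here is a minimal sketch of loading one of the base models above with the Hugging Face transformers library. It assumes transformers, torch, and accelerate are installed; the model id, dtype, and generation settings are illustrative and not prescribed by the model cards.

```python
# Minimal sketch: load a Hebrew base model from the list above and generate text.
# Assumptions: `transformers`, `torch`, and `accelerate` are installed, and a GPU
# with bf16 support is available. Swap in any other model id from this collection.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yam-peleg/Hebrew-Mistral-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 hardware; use float32 on CPU
    device_map="auto",           # requires accelerate
)

# The extended vocabulary (base Mistral vocab plus the added Hebrew tokens)
# is reflected in the tokenizer size.
print("vocab size:", len(tokenizer))

prompt = "שלום, מה שלומך?"  # "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The base models are plain completion models, so prompts are continued rather than answered; for chat-style behavior, the instruct fine-tune (Hebrew-Gemma-11B-Instruct) is the more suitable choice.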