
Is there any hope for 500M and 1B parameter models like Apple OpenELM?

#4
by yousef1727 - opened

I'm just wondering: what if these sizes existed? That would be nice, especially on Android devices.

Some apps need tiny, mini, or medium models for simple tasks rather than full QnA: I mean summarizing, explaining, text editing, etc.

I really trust Google, and I know I'll probably see more models in the future. Google Gemma is the only model that understands Arabic for me, because the Gemma tokenizer splits text into words rather than letters the way GPT does, which means it can capture the meaning between words instead of letters, and that reduces mistakes.
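
If anyone wants to check how differently the tokenizers split Arabic, here is a minimal sketch assuming the `transformers` library; the model IDs `google/gemma-2b` and `gpt2` are just illustrative, and the Gemma repos are gated, so you may need to accept the license and log in first:

```python
# Rough sketch: print how each tokenizer splits an Arabic sentence.
# Model IDs are illustrative; Gemma checkpoints are gated, so you may
# need `huggingface-cli login` after accepting the license.
from transformers import AutoTokenizer

text = "مرحبا بالعالم"  # "Hello, world" in Arabic

for model_id in ("google/gemma-2b", "gpt2"):
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    tokens = tokenizer.tokenize(text)
    # Fewer, longer tokens usually means the vocabulary covers the
    # language well instead of falling back to many tiny fragments.
    print(f"{model_id}: {len(tokens)} tokens -> {tokens}")
```
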

Finally, I want to repeat: Apple made a big family of sizes, so why doesn't Google?

Hi @yousef1727 , Great news! Based on your feedback, the Gemma team has released smaller Gemma 3 models, including the Gemma3-1b (both pretrained and instruction-tuned) specifically for text generation. We hope this meets your needs!
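
For anyone who wants to try the new 1B model on-device, here is a minimal sketch of the GGUF / llama.cpp route using the `llama-cpp-python` bindings; the local file name and quantization level are assumptions, not an official release artifact:

```python
# Minimal sketch: run a small GGUF quantization with llama-cpp-python.
# The model_path is a hypothetical local file name -- point it at whatever
# Gemma 3 1B GGUF file you have downloaded or converted with llama.cpp.
from llama_cpp import Llama

llm = Llama(
    model_path="gemma-3-1b-it-Q4_K_M.gguf",  # assumed file name
    n_ctx=2048,  # small context window to keep memory low on mobile-class hardware
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize: Gemma now ships a 1B model."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```
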
