What is the device specification required to run Gemma-3n, locally?
#6
by Aayush22 - opened
I am planning to use the model locally, but all the Gemma models seem to be over 5B params, according to the Google DeepMind information... Gemma-3n can be used to build AI apps that work offline. Will a model with over 5B params be able to run on mobile devices? And while building, what PC specifications are required to build a platform using Gemma-3n?
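For context, here is a rough, weights-only back-of-the-envelope estimate of the memory a 5B-parameter model would need at different precisions. This is just a sketch: the 5B figure is the number from my question, and real usage adds KV cache, activations, and framework overhead on top of the weights.

```python
# Rough, weights-only memory estimate by parameter count and precision.
# Does NOT include KV cache, activations, or framework overhead.

def weights_memory_gib(num_params: float, bits_per_param: float) -> float:
    """Approximate memory needed just to hold the weights, in GiB."""
    return num_params * bits_per_param / 8 / (1024 ** 3)

for label, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"5B params @ {label}: ~{weights_memory_gib(5e9, bits):.1f} GiB")

# Roughly: fp16 ~9.3 GiB, int8 ~4.7 GiB, int4 ~2.3 GiB of weight memory.
```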