Barcenas 27b

Based on SillyTilly/google-gemma-2-27b-it and fine-tuned on the pinzhenchen/alpaca-cleaned-es dataset for Spanish.

The goal of this model is to provide a relatively large model optimized for Spanish, performing at the level of the early versions of GPT-4.

I am proud of this model: it is the biggest and most powerful one I have built, and without a doubt the result of my short time in the AI world.
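
Below is a minimal inference sketch, assuming the standard Hugging Face transformers text-generation API and the Gemma 2 chat template inherited from the base model; the prompt is only illustrative and not an official example from the author.

```python
# Minimal inference sketch (assumes the standard transformers API;
# not an official example for this model).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Danielbrdz/Barcenas-27b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the FP16 weights
    device_map="auto",
)

# Gemma 2 instruction models use a chat template; apply it to the prompt.
messages = [{"role": "user", "content": "¿Cuál es la capital de México?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```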

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽

Model size: 27.2B parameters · Tensor type: FP16 · Format: Safetensors

Quantizations: 2 quantized models are available in the Danielbrdz/Barcenas-27b model tree.