Google

company

AI & ML interests

Google ❤️ Open Source AI

google's collections (39)

Flan-T5 release
The Flan-T5 release covers four checkpoints of different sizes. It also includes upgraded versions trained using universal sampling.
Switch-Transformers release
This release included various MoE (Mixture-of-Experts) models based on the T5 architecture. The base models use from 8 to 256 experts.
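The routing idea behind these MoE models can be sketched in a few lines: each token is sent to exactly one expert (top-1, or "switch", routing) chosen by a small learned router. A minimal plain-Python illustration, with toy token and router values made up for the example (the real models route hidden states through feed-forward experts inside each layer):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def switch_route(token, router_weights):
    """Top-1 (switch) routing: score the token against each expert's
    router row, then pick the single highest-probability expert."""
    logits = [sum(w * x for w, x in zip(row, token)) for row in router_weights]
    probs = softmax(logits)
    expert = max(range(len(probs)), key=probs.__getitem__)
    return expert, probs[expert]

# Toy example: a 2-dim "token" routed across 3 experts.
expert, gate = switch_route([1.0, 0.0], [[2.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
```

In the full Switch Transformer, the gate probability also scales the chosen expert's output, and an auxiliary loss balances load across the 8 to 256 experts.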
TimesFM Release
TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.
MedGemma Release
Collection of Gemma 3 variants for performance on medical text and image comprehension to accelerate building healthcare-based AI applications.
Gemma 3 QAT
Quantization-Aware Trained (QAT) Gemma 3 checkpoints. The models preserve quality similar to half precision while using 3x less memory.
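The "3x less memory" figure is roughly back-of-envelope arithmetic on weight precision: going from 16-bit (half precision) to 4-bit weights shrinks raw weight storage by 4x, and real checkpoints land nearer 3x once non-quantized tensors and overhead are counted. A rough sketch (the 4B parameter count is an illustrative assumption, not an official size breakdown):

```python
def checkpoint_bytes(n_params: float, bits_per_param: int) -> float:
    """Raw weight footprint in bytes: parameters x bits / 8."""
    return n_params * bits_per_param / 8

n_params = 4e9  # hypothetical 4B-parameter model, for illustration only
half_precision = checkpoint_bytes(n_params, 16)  # bf16/fp16 weights
quantized = checkpoint_bytes(n_params, 4)        # int4 QAT weights
ratio = half_precision / quantized               # 4x on raw weights alone
```

Embeddings and other tensors often stay at higher precision, which is why the practical saving is closer to 3x than the raw 4x.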
Health AI Developer Foundations (HAI-DEF)
Groups models released for use in health AI by Google. Read more about HAI-DEF at https://developers.google.com/health-ai-developer-foundations
BERT release
Groups the original BERT models released by the Google team. Unless marked otherwise, the checkpoints support English.
T5 release
The original T5 transformer release was done in two steps: the original T5 checkpoints and the improved T5v1 checkpoints.
SigLIP
Contrastive (sigmoid) image-text models from https://arxiv.org/abs/2303.15343
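The "contrastive (sigmoid)" part refers to the pairwise sigmoid loss from the linked paper: instead of a batch-wide softmax, every image-text pair is treated as an independent binary match/no-match classification. A minimal sketch, assuming pre-normalized embeddings and made-up values for the learned temperature `t` and bias `b`:

```python
import math

def sigmoid_loss(img_embs, txt_embs, t=10.0, b=-10.0):
    """Pairwise sigmoid image-text loss: label +1 for matching pairs
    (i == j), -1 otherwise; average the per-pair log-sigmoid losses."""
    n = len(img_embs)
    total = 0.0
    for i in range(n):
        for j in range(n):
            sim = sum(a * c for a, c in zip(img_embs[i], txt_embs[j]))
            z = 1.0 if i == j else -1.0
            total += -math.log(1.0 / (1.0 + math.exp(-z * (t * sim + b))))
    return total / n

# Toy check: aligned embeddings should score a lower loss than shuffled ones.
embs = [[1.0, 0.0], [0.0, 1.0]]
aligned = sigmoid_loss(embs, embs)
shuffled = sigmoid_loss(embs, [embs[1], embs[0]])
```

In the actual models, `t` and `b` are learned parameters; the values here are placeholders chosen so the toy example behaves sensibly.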