Other Resources
Learn how to use Hugging Face on Google Cloud by reading our blog posts, presentations, Google documentation, and examples below.
Blog posts
- Hugging Face and Google partner for open AI collaboration
- Google Cloud TPUs made available to Hugging Face users
- Making thousands of open LLMs bloom in the Vertex AI Model Garden
- Deploy Meta Llama 3.1 405B on Google Cloud Vertex AI
Presentations
Google Documentation
- Google Cloud Hugging Face Deep Learning Containers
- Google Cloud public Artifact Registry for DLCs
- Serve Gemma open models using GPUs on GKE with Hugging Face TGI
- Generative AI on Vertex - Use Hugging Face text generation models
Examples
Vertex AI
Inference
- Deploy BERT Models with PyTorch Inference DLC on Vertex AI
- Deploy Embedding Models with TEI DLC on Vertex AI
- Deploy FLUX with PyTorch Inference DLC on Vertex AI
- Deploy Gemma 7B with TGI DLC from GCS on Vertex AI
- Deploy Gemma 7B with TGI DLC on Vertex AI
- Deploy Llama 3.2 11B Vision with TGI DLC on Vertex AI
- Deploy Meta Llama 3.1 405B with TGI DLC on Vertex AI
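The Vertex AI inference examples above all follow a similar pattern: register the model on Vertex AI with a Hugging Face Deep Learning Container (DLC) as the serving image, then deploy it to a GPU-backed endpoint. The sketch below illustrates that flow with the `google-cloud-aiplatform` Python SDK; the project, region, container URI, model, and machine configuration are placeholders, so refer to the linked examples and the public Artifact Registry for DLCs above for the exact values to use.

```python
import os

from google.cloud import aiplatform

# Placeholders: set your own project and region, and pick a current
# Hugging Face DLC URI from Google Cloud's public Artifact Registry.
aiplatform.init(project="my-gcp-project", location="us-central1")
TGI_DLC_URI = "us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-text-generation-inference-cu121.2-2.ubuntu2204.py310"

# Register the model on Vertex AI, pointing the serving container at the
# TGI DLC and telling it which Hub model to load at startup.
model = aiplatform.Model.upload(
    display_name="gemma-7b-it-tgi",
    serving_container_image_uri=TGI_DLC_URI,
    serving_container_environment_variables={
        "MODEL_ID": "google/gemma-7b-it",
        "NUM_SHARD": "1",
        "HF_TOKEN": os.environ["HF_TOKEN"],  # needed for gated models such as Gemma
    },
)

# Deploy the registered model to an endpoint with a single NVIDIA L4 GPU.
endpoint = model.deploy(
    machine_type="g2-standard-4",
    accelerator_type="NVIDIA_L4",
    accelerator_count=1,
)

# Online prediction: TGI expects each instance to carry the prompt under "inputs".
print(endpoint.predict(instances=[{"inputs": "What is a Deep Learning Container?"}]))
```

Each example linked above walks through this flow end to end, including the exact DLC tag and hardware configuration for the model in question.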
Training
Evaluation
GKE
Inference
- Deploy BGE Base v1.5 with TEI DLC from GCS on GKE
- Deploy Gemma2 with multiple LoRA adapters with TGI DLC on GKE
- Deploy Llama 3.1 405B with TGI DLC on GKE
- Deploy Llama 3.2 11B Vision with TGI DLC on GKE
- Deploy Meta Llama 3 8B with TGI DLC on GKE
- Deploy Qwen2 7B with TGI DLC from GCS on GKE
- Deploy Snowflake's Arctic Embed with TEI DLC on GKE
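The GKE examples above run the same DLC images as Kubernetes workloads; once a TGI DLC is serving inside the cluster, it exposes the standard Text Generation Inference HTTP API. As a minimal sketch, assuming the Service has been forwarded to `localhost:8080` (for example with `kubectl port-forward service/tgi-service 8080:8080`, where the Service name is illustrative), you can query it from Python as follows; the host, port, and generation parameters are placeholders.

```python
import requests

# Assumes the TGI Service in the cluster has been forwarded locally, e.g.:
#   kubectl port-forward service/tgi-service 8080:8080
TGI_URL = "http://localhost:8080"

# TGI's /generate endpoint takes the prompt under "inputs" plus optional
# generation parameters, and returns the completion as "generated_text".
response = requests.post(
    f"{TGI_URL}/generate",
    json={
        "inputs": "What is GKE?",
        "parameters": {"max_new_tokens": 128, "temperature": 0.7},
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["generated_text"])
```

The linked GKE examples cover the preceding steps in detail: provisioning a GPU node pool, writing the Deployment and Service manifests for the DLC, and exposing or port-forwarding the service.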
Training