🚀 ZeroGPU medium size is now available as a power-user feature
Nothing too fancy for now (ZeroGPU Spaces still default to large, 70GB VRAM), but this paves the way for:
- 💰 size-based quotas / pricing (medium will offer significantly more usage than large)
- 🦣 the upcoming xlarge size (141GB VRAM)
You can now control the GPU size via a Space variable (example below). Accepted values:
- auto (future default)
- medium
- large (current default)
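A minimal sketch of setting such a Space variable programmatically with `huggingface_hub`. The variable name `ZEROGPU_SIZE` and the Space id are placeholders, not confirmed names; check your Space settings for the exact key.

```python
# Sketch: set a GPU-size Space variable via the Hub API.
# "ZEROGPU_SIZE" and the repo_id are assumptions for illustration only.
from huggingface_hub import HfApi

api = HfApi()
api.add_space_variable(
    repo_id="your-username/your-space",  # hypothetical Space id
    key="ZEROGPU_SIZE",                  # assumed variable name
    value="medium",                      # accepted: "auto", "medium", "large"
)
```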
The auto mode checks the total CUDA tensor size during startup:
- More than 30GB → large
- Otherwise → medium
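For intuition, here is an illustrative sketch (not the actual ZeroGPU implementation) of what such an auto check could look like: sum the memory held by CUDA tensors at startup and compare it against the 30GB threshold.

```python
# Illustrative only: estimate total CUDA tensor memory and pick a GPU size.
import gc
import torch

def pick_gpu_size(threshold_gb: float = 30.0) -> str:
    total_bytes = 0
    for obj in gc.get_objects():
        # Count memory held by tensors currently resident on CUDA
        if isinstance(obj, torch.Tensor) and obj.is_cuda:
            total_bytes += obj.element_size() * obj.nelement()
    return "large" if total_bytes > threshold_gb * 1024**3 else "medium"
```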
Hi everyone! I'm Alex, I'm 16, and I've been an intern at Hugging Face for a little over a week. I've already learned a lot about using and prompting LLMs. With @victor as my tutor, I've just finished a Space that analyzes your feelings by prompting an LLM chat model. The aim is to extend it so that it can categorize Hugging Face posts.
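A minimal sketch of the idea, prompting a chat LLM to classify feelings via `huggingface_hub`'s `InferenceClient`. The model name and prompt wording are assumptions for illustration, not necessarily what the Space uses.

```python
# Sketch: ask a chat LLM to classify the feeling in a piece of text.
# Model and prompt are placeholders, not the Space's actual setup.
from huggingface_hub import InferenceClient

client = InferenceClient("meta-llama/Meta-Llama-3-8B-Instruct")  # assumed model

def analyze_feelings(text: str) -> str:
    response = client.chat_completion(
        messages=[
            {
                "role": "system",
                "content": "Classify the feeling in the user's message as "
                           "positive, negative, or neutral, and briefly explain why.",
            },
            {"role": "user", "content": text},
        ],
        max_tokens=100,
    )
    return response.choices[0].message.content

print(analyze_feelings("I just finished my first Space and I'm thrilled!"))
```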