marcelone's Collections
Language Learning - Moe - 12 GB GPU + 32 GB RAM

updated 3 days ago

Force Model Expert Weights onto CPU
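The note above refers to keeping MoE expert tensors in system RAM so the dense layers and KV cache fit in 12 GB of VRAM. A minimal sketch, assuming llama.cpp's `llama-server` with its `--override-tensor` (`-ot`) flag; the model filename is illustrative, and the regex below is one common pattern for matching per-layer FFN expert tensors:

```shell
# Sketch: serve a MoE model with expert weights pinned to system RAM.
# -ot "<pattern>=CPU" keeps tensors whose names match the regex on the
# CPU backend; --n-gpu-layers 99 offloads everything else to the GPU.
# (Flag names as in recent llama.cpp builds; model path is hypothetical.)
llama-server \
  -m gpt-oss-20b-MXFP4.gguf \
  --n-gpu-layers 99 \
  -ot "ffn_.*_exps\.=CPU" \
  -c 8192
```

This trades prompt-processing and generation speed for VRAM headroom: the expert matmuls run on (or stream from) the CPU, which is why the collection also assumes 32 GB of system RAM.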


  • lmstudio-community/gpt-oss-20b-GGUF

    21B • Updated 17 days ago • 530k • 53

    Note: MXFP4_MOE, 12.1 GB. Languages: Chinese, Japanese, Russian, English, Spanish, etc.


  • marcelone/Jinx-Qwen3-30B-A3B-Thinking-2507-gguf

    31B • Updated 3 days ago • 142 • 1

    Note: MXFP4_MOE, 17.1 GB. Languages: Chinese.
