MagicPrompt_SD_V1
This is a prompt generator like Gustavosta/MagicPrompt-Stable-Diffusion!
But I cleaned the original prompt data and trained a more powerful model to generate prompts for 魔导绪论.
It uses Paddle for training and everything else, not PyTorch or TensorFlow.
Here are the results I got from this model:
- You can run the model on CPU, but GPU is about 10x faster than CPU.
- CPU: about 300 ms per prompt | GPU: about 90 ms per prompt (V2-10 model)
- You can tweak the output easily by passing a few parameters.
A usage example is here.
You can wrap it in a FastAPI or Flask app to deploy it easily on your server.
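As a minimal sketch of such a wrapper, here is a Flask server exposing the generator over HTTP. `generate_prompt` is a hypothetical placeholder (this card does not document the model's exact inference API); swap in your Paddle inference code, and the `max_length` query parameter stands in for whatever generation parameters the model actually accepts.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def generate_prompt(text: str, max_length: int = 64) -> str:
    # Placeholder: run the MagicPrompt_SD_V1 model here (via Paddle)
    # and return the expanded Stable Diffusion prompt.
    return (text + ", highly detailed, trending on artstation")[:max_length]

@app.route("/generate")
def generate():
    # Read the seed text and generation parameters from the query string.
    text = request.args.get("text", "")
    max_length = int(request.args.get("max_length", 64))
    return jsonify({"prompt": generate_prompt(text, max_length)})
```

Run it with `flask run` (or mount the same handler in FastAPI) and call `GET /generate?text=a cat` to receive the generated prompt as JSON.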