Smaller Version Request

#5
by oguzhanercan - opened

Hi, thanks for your great work. This is the best VL model I have ever tried, but it is almost impossible to deploy. Are you planning to distill it into smaller models, as DeepSeek did?

Completely agree! MiniMaxAI/MiniMax-VL-01 is hands down the best VL model right now, but deployment is a real challenge. A distilled version, like what DeepSeek did, could make it way more practical while keeping its incredible VL capabilities. If not distillation, at least a well-optimized quantized version could help. Hoping the team considers this!
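
For anyone who wants to experiment in the meantime, here is a minimal sketch of what a 4-bit quantized load could look like, assuming the model's custom Hub code is compatible with transformers' standard bitsandbytes path (untested for this model, and even at 4-bit the full model remains very large):

```python
# Hypothetical sketch: loading MiniMaxAI/MiniMax-VL-01 with 4-bit bitsandbytes
# quantization via transformers. Untested for this model; assumes the remote
# code on the Hub works with the standard quantization path.
import torch
from transformers import AutoModelForCausalLM, AutoProcessor, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # NF4 weights, roughly 4x smaller than fp16
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,  # keep compute in bf16
)

model_id = "MiniMaxAI/MiniMax-VL-01"
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",        # shard across available GPUs
    trust_remote_code=True,
)
```

Even then, multi-GPU sharding would likely still be needed, which is why a properly distilled smaller model would be far more practical.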

MiniMax org

Hello, if we make any plans in this regard, we will inform you and other developers in advance. Thank you for your support of our MiniMax-VL-01 model. We invite you to follow the future progress of our models!

MiniMax-AI changed discussion status to closed
