Will there be a 20B version of internlm4? Will there be a ~20B VLM based on internlm4 20B or Qwen3 14B?
#2 · by lingyezhixing · opened
200B+ is really too large for individuals and small studios to deploy locally; hoping there will be a smaller, high-performance model of the same type, around 20B.
A lite version is on the way.