Ollama Availability
#16 · opened by PlayAI
When will the model be available in Ollama?
Hi @PlayAI - the model is not currently supported in llama.cpp (which is what Ollama depends on); once it is supported there, it should work in Ollama. We don't have an active initiative to add llama.cpp support at the moment, but we're hoping to find free cycles for it soon.
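For reference, if llama.cpp support lands and a GGUF conversion of the model becomes available, the usual way to load it into Ollama is to point a Modelfile at the GGUF and register it locally. This is a generic sketch; the file name `model-q4_k_m.gguf` and the tag `my-model` are placeholders, not an official release.

```bash
# Hypothetical Modelfile pointing at a local GGUF conversion (placeholder path)
cat > Modelfile <<'EOF'
FROM ./model-q4_k_m.gguf
EOF

# Register the model with Ollama under a local tag, then run it
ollama create my-model -f Modelfile
ollama run my-model
```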
Thank you for your response. I appreciate the work you and your team are doing.