Please add this model to HuggingChat
Hi there!
Would you mind adding this model to HuggingChat? We really want to try it, but the OpenRouter version of this model starts charging after a limited number of free prompts.
We would love to hear your response!
BTW, if you do decide to add it to HuggingChat, feel free to do so through the official GitHub repository: https://github.com/huggingface/chat-ui
Glad to hear you like the model :)
We are currently planning to host a model API (e.g. via an HF Space) in the future.
In the meantime, you could try the quantized model with ollama or llama.cpp; the Q4_K_M version runs on most devices with a low memory footprint and moderate speed :)
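In case it helps, here is a minimal sketch of running a Q4_K_M GGUF locally with the llama-cpp-python bindings. The file name and prompt below are placeholders, not the actual file for this model:

```python
from llama_cpp import Llama

# Placeholder path: point this at whichever Q4_K_M GGUF you downloaded.
llm = Llama(model_path="model-Q4_K_M.gguf", n_ctx=4096)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello! Introduce yourself briefly."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

With ollama the equivalent is to import the GGUF via a Modelfile and run it from the CLI; either route keeps everything on your own machine.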
Thanks, but since all of us want a unified space for all open-source LLMs, would you still consider adding this model to HuggingChat? That way we wouldn't have to download it locally (which is hard on low-end hardware) or use OpenRouter (which requires a subscription for unlimited prompts). Hosting it on HuggingChat would save us a lot of time, resources, and effort, since it would run online, and it would also give the model access to all the community tools that most other LLMs on HuggingChat already have.