Different parameters from llama.cpp recommended settings?

#6
by CHNtentes - opened

llama.cpp recommends llama-server -hf ggml-org/gpt-oss-120b-GGUF -c 0 -fa --jinja --reasoning-format none


That's for llama-server, and the sampling settings (temperature, etc.) aren't set in that command.
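For reference, llama.cpp's sampling options can be passed to llama-server on the command line; a minimal sketch combining the recommended command with explicit sampling flags (the `--temp` and `--top-p` values here are illustrative, not confirmed recommendations from this thread beyond the temperature of 1.0 mentioned below):

```shell
# Recommended llama.cpp invocation, extended with explicit sampling parameters.
# -c 0        : use the model's full context length
# -fa         : enable flash attention
# --jinja     : use the model's chat template
# --temp 1.0  : temperature (value per the Unsloth reply in this thread)
# --top-p 1.0 : illustrative; check the model card for recommended values
llama-server -hf ggml-org/gpt-oss-120b-GGUF -c 0 -fa --jinja \
  --reasoning-format none --temp 1.0 --top-p 1.0
```

Note that sampling parameters passed per-request through the server's API override these command-line defaults.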

Unsloth AI org

We updated the temperature to 1.0 btw!

shimmyshimmer changed discussion status to closed
