---
base_model:
- Shome/croguana-RC2
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- gguf
license: cc-by-sa-4.0
language:
- hr
pipeline_tag: text2text-generation
datasets:
- timdettmers/openassistant-guanaco
- mhardalov/exams
---

Uploaded model
- Developed by: Shome
- License: cc-by-sa-4.0
- Finetuned from model: gordicaleksa/YugoGPT
Model prompt:
"### Korisnik:\n[upit]\n### AI asistent:\n[odgovor]\n"
([upit] = user query, [odgovor] = assistant reply.) The fine-tune targets chat mode; the template above can be repeated for as many turns as needed (see the sketch below).
Context size during training was 8192 tokens.
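A minimal sketch of assembling a multi-turn prompt from the template above; the helper name `build_prompt` and the example turns are illustrative, not part of the released model.

```python
def build_prompt(turns):
    """turns: list of (user_query, assistant_reply_or_None) pairs."""
    parts = []
    for query, reply in turns:
        # Each turn follows the "### Korisnik:" / "### AI asistent:" template.
        parts.append(f"### Korisnik:\n{query}\n### AI asistent:\n")
        if reply is not None:
            parts.append(f"{reply}\n")
    return "".join(parts)

# One completed turn plus a new open question for the model to answer.
prompt = build_prompt([
    ("Pozdrav, tko si ti?", "Ja sam AI asistent."),
    ("Koji je najveći grad u Hrvatskoj?", None),
])
```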
This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
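A minimal inference sketch with transformers, assuming the merged (non-GGUF) weights are loadable under the repo id "Shome/croguana-RC2" from the metadata above; adjust the id or use a GGUF runtime for the quantized files.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Shome/croguana-RC2"  # assumed repo id, taken from base_model in the metadata
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Prompt follows the chat template described above.
prompt = "### Korisnik:\nKoji je glavni grad Hrvatske?\n### AI asistent:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens (the reply after the prompt).
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```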