---
license: apache-2.0
language:
- ko
---
## Model Details
**Model Developers** Yunho Mo (momo)
**Input** The model accepts text input only.
**Output** The model generates text output only.
**Model Architecture**
polyglot-ko-12.8b-Chat-QLoRA-Merge is an auto-regressive language model based on the polyglot-ko-12.8b transformer architecture.
**Base Model**
[polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b)
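
Since the model name indicates that QLoRA adapter weights were merged back into the base model, here is a minimal sketch of that merge step using PEFT. The adapter path below is hypothetical and stands in for the actual fine-tuned checkpoint; this is not the exact script used to produce the release.

```python
# Minimal sketch: fold a QLoRA adapter into the polyglot-ko-12.8b base model.
# The adapter_id is a placeholder, not a published checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "EleutherAI/polyglot-ko-12.8b"
adapter_id = "path/to/qlora-adapter"  # hypothetical local or Hub adapter checkpoint

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, adapter_id)

# Merge the LoRA weights into the base weights and drop the adapter wrappers.
merged = model.merge_and_unload()
merged.save_pretrained("polyglot-ko-12.8b-Chat-QLoRA-Merge")

tokenizer = AutoTokenizer.from_pretrained(base_id)
tokenizer.save_pretrained("polyglot-ko-12.8b-Chat-QLoRA-Merge")
```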
**Training Dataset**
The model was fine-tuned with QLoRA on [KOpen-platypus](https://huggingface.co/datasets/kyujinpy/KOpen-platypus), [ko-lima](https://huggingface.co/datasets/taeshahn/ko-lima), and [EverythingLM-data-V2-Ko](https://huggingface.co/datasets/ziozzang/EverythingLM-data-V2-Ko).
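
A minimal inference sketch with 🤗 Transformers follows. The repo id and the prompt shown are assumptions for illustration; this card does not specify a fixed chat template.

```python
# Minimal inference sketch for the merged model (repo id assumed, not confirmed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "momo/polyglot-ko-12.8b-Chat-QLoRA-Merge"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the accelerate package
)

prompt = "한국의 수도는 어디인가요?"  # "What is the capital of Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```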