---
library_name: transformers
tags:
  - merge
  - llama-3.1
  - roleplay
  - function calling
base_model:
  - unsloth/Meta-Llama-3.1-8B-Instruct
  - REILX/Llama-3-8B-Instruct-750Mb-lora
datasets:
  - databricks/databricks-dolly-15k
  - microsoft/orca-math-word-problems-200k
  - LooksJuicy/ruozhiba
base_model_relation: merge
---

# KRONOS V1 P3

This is a merge of Meta Llama 3.1 8B Instruct and REILX's "750Mb" LoRA, created using llm-tools.

The primary purpose of this model is to be merged into other models in the same family using the TIES merge method.
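
As an illustration only, a downstream TIES merge could be described with a mergekit-style configuration along the lines of the sketch below. The second model entry and all density/weight values are placeholders, not a recipe used for any published KRONOS merge.

```yaml
# Hypothetical mergekit TIES config (sketch, not an actual KRONOS recipe).
merge_method: ties
base_model: unsloth/Meta-Llama-3.1-8B-Instruct
models:
  - model: T145/KRONOS-8B-V1-P3
    parameters:
      density: 0.5   # placeholder value
      weight: 0.5    # placeholder value
  - model: another/Llama-3.1-8B-Instruct-finetune  # placeholder model
    parameters:
      density: 0.5
      weight: 0.5
parameters:
  normalize: true
dtype: bfloat16
```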

Creating quants of this model is unnecessary, since it is intended as a merge ingredient rather than for direct use.

## Merge Details

### Configuration

The following Bash command was used to produce this model:

```bash
python /llm-tools/merge-lora.py -m unsloth/Meta-Llama-3.1-8B-Instruct -l REILX/Llama-3-8B-Instruct-750Mb-lora
```
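
For reference, a roughly equivalent LoRA merge can be sketched with the PEFT and Transformers libraries. This is an approximation of what `merge-lora.py` does, not the actual script:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE = "unsloth/Meta-Llama-3.1-8B-Instruct"
LORA = "REILX/Llama-3-8B-Instruct-750Mb-lora"

# Load the base model and attach the LoRA adapter.
base = AutoModelForCausalLM.from_pretrained(BASE, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(base, LORA)

# Fold the adapter weights into the base weights, drop the PEFT wrappers,
# then save the merged model alongside the base tokenizer.
merged = model.merge_and_unload()
merged.save_pretrained("KRONOS-8B-V1-P3")
AutoTokenizer.from_pretrained(BASE).save_pretrained("KRONOS-8B-V1-P3")
```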