merge

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the DARE TIES merge method, with unsloth/Mistral-Small-Instruct-2409 as the base model.
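
The mergekit configuration used for this merge is not reproduced in this card. As a rough sketch only, a DARE TIES merge on this base would be described by a mergekit YAML config along the following lines; the contributing models, densities, and weights shown here are placeholders, not the actual settings.

# Hypothetical mergekit config: source models and parameter values are placeholders.
models:
  - model: example/finetune-a        # placeholder
    parameters:
      density: 0.5
      weight: 0.5
  - model: example/finetune-b        # placeholder
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: unsloth/Mistral-Small-Instruct-2409
dtype: bfloat16

With mergekit installed, a config like this is applied with mergekit-yaml config.yaml ./output-model.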



💻 Usage

!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "Kaoeiri/MS-Hempantheonsel-Mull-v5x1.8RP-Cydonia-vXXX-24B-12"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Format the conversation with the model's chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Load the model in half precision and map it across the available devices.
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Sample a response.
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])