🧠 Irixxed-Magcap-12B-Slerp

Merge power meets pure reasoning finesse.

It starts with Violet_Magcap-12B and blends in the smooth strength of Irix-12B-Model_Stock.

No gimmicks, just synergy:
A classic Slerp merge crafted for sharp reasoning and solid performance—because why settle for one when you can have both?
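
For context, slerp interpolates along the great-circle arc between the two models' weight tensors rather than along a straight line, which tends to preserve weight magnitudes during the blend. A minimal numpy sketch of the idea (illustrative only, not mergekit's actual implementation):

```python
# Illustrative sketch of spherical linear interpolation (slerp) between two
# weight tensors. NOT mergekit's implementation -- just the underlying formula.
import numpy as np

def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate from tensor a (t=0) to tensor b (t=1)."""
    # Angle between the two flattened weight vectors.
    cos_omega = np.dot(a.ravel(), b.ravel()) / (
        np.linalg.norm(a) * np.linalg.norm(b) + eps
    )
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if omega < eps:
        # Nearly parallel tensors: plain linear interpolation is equivalent.
        return (1.0 - t) * a + t * b
    s = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / s) * a + (np.sin(t * omega) / s) * b
```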


ChatML Format
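
For reference, the standard ChatML turn structure looks like this (placeholders stand in for actual content):

```
<|im_start|>system
{system prompt}<|im_end|>
<|im_start|>user
{user message}<|im_end|>
<|im_start|>assistant
```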

⚙️ Usage Presets

🎛️ SillyTavern Presets


💾 Quantized Versions

- Q8_0 (GGUF)
- Q5_K_M (GGUF)
- Q4_K_M (GGUF)
- 4bpw (ExL2)

🛠️ Model Details

| Feature | Description |
| --- | --- |
| Base Models | Violet_Magcap-12B + Irix-12B-Model_Stock |
| Size | 12B parameters |
| Library | Transformers |
| Merge Type | Regular Slerp |
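
A minimal loading sketch with transformers; the repo id is taken from this card, and bfloat16 matches the merge dtype:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nitral-AI/Irixxed-Magcap-12B-Slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain slerp merging in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```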

📦 Reasoning Information

- Reasoning Block + Prefix
- Reasoning Format
- Quick Replies
- Quick Reply Preset

⚙️ Mergekit Config

```yaml
slices:
  - sources:
      - model: DreadPoor/Irix-12B-Model_Stock
        layer_range: [0, 40]
      - model: Nitral-AI/Violet_Magcap-12B
        layer_range: [0, 40]
merge_method: slerp
base_model: DreadPoor/Irix-12B-Model_Stock
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.50
dtype: bfloat16
```
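
The `t` gradients above set the interpolation factor per layer: 0 keeps the base model (Irix-12B-Model_Stock), 1 takes Violet_Magcap-12B, and the attention and MLP blocks are blended on opposite schedules. Mergekit interpolates such value lists across the layer range; a rough sketch of that expansion, assuming evenly spaced piecewise-linear anchors (the exact spacing scheme may differ):

```python
import numpy as np

anchors = [0, 0.5, 0.3, 0.7, 1]            # the self_attn "t" gradient above
layers = 40                                 # layer_range [0, 40]
x_anchor = np.linspace(0, 1, len(anchors))  # assumed even anchor spacing
x_layer = np.linspace(0, 1, layers)
t_per_layer = np.interp(x_layer, x_anchor, anchors)
print(t_per_layer.round(3))
```

The merge itself can be reproduced by saving the config above and running mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged-model`.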

🌀 Vibe Check

Synergy in code, clarity in reasoning.
Use it wisely—or just enjoy the smooth ride.

🧬 Created by: Nitral-AI
