Model Card for Skilgrimr-Qwen3-14B
Ahoy! Welcome aboard, matey.
This be Skilgrimr-Qwen3-14B: a rowdy, myth-brewin', logic-slingin', treasure-huntin' pirate bard AI!
Built atop the mighty Qwen3-14B model, this fine beast switches between deep logical thinking and drunken freestyle lore faster than you can down a flagon of rum.
Born from a 420-fueled dream of pirate taverns, cursed islands, and chaotic wisdom, Skilgrimr be here to weave legends, riddle ya wit riddles, and maybe, just maybe, guide ya to treasure.
Model card template blessed by the stars: the official Hugging Face template.
Model Details
- Made by: Mind Expander Collective
- Funded by: Bootstraps, rum, and dreams
- Shared by: Mind Expander Collective
- Model type: Causal language model with a reasoning/non-reasoning brain toggle
- Languages: Speaks over 100 tongues, including Pirate
- License: Apache 2.0
- Finetuned from: Qwen/Qwen3-14B, forged in the cosmic fires of Alibaba's Qwen team
Model Sources
- Repo: [Link here soon!]
- Paper (Base Model): Qwen3 Paper
- Demo: [Maybe if the tavern doesn't burn down first]
Uses
Direct Use
- Storytellin' wild pirate myths
- DnD or Gwent-style game masterin'
- 420 Twitter Spaces hangout partner
- Creating riddles for treasure hunts
- Fantasy world building, from Kraken myths to cursed compasses
Downstream Use
- Fine-tune into a full tavern AI game master
- Create NFT or real-world scavenger hunts
- Voice activation in VR parties
Forbidden Seas (Out-of-Scope)
- Real legal, medical, or financial advice (a drunk pirate AI is NOT a lawyer)
- Real-world navigation (he might crash yer ship)
Biases, Risks & Limitations
- Tends to tell tall tales (obviously)
- Pirate accent may confuse landlubbers
- Talks about ale, rum, and the occasional shroom
- Shouldn't be trusted with the ship's wheel
Best Practices
- Mark everything as fictional when using in games or public spaces
- Keep an eye out when letting younger sailors listen in
How to Get Started
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "your-username/Skilgrimr-Qwen3-14B"

# Load the tokenizer and the model (device_map="auto" spreads layers across available GPUs)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto",
)

# Build a chat prompt; enable_thinking=True turns on the deep-reasoning mode
prompt = "Spin me a yarn about a one-eyed mermaid who steals stars."
messages = [{"role": "user", "content": prompt}]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)

model_inputs = tokenizer([text], return_tensors="pt").to(model.device)
generated_ids = model.generate(**model_inputs, max_new_tokens=32768)

# Drop the prompt tokens and decode only the newly generated text
output = tokenizer.decode(
    generated_ids[0][len(model_inputs.input_ids[0]):],
    skip_special_tokens=True,
)
print(output)
```
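With thinking mode on, Skilgrimr's reply carries his inner mutterings alongside the final yarn. If ye decode with `skip_special_tokens=False` instead, the reasoning stays wrapped in `<think>...</think>` tags, and a wee helper can split the two. This is a sketch of mine, not an official API; the function name and the mocked-up reply are invented for illustration.

```python
def split_thinking(reply: str) -> tuple[str, str]:
    """Split a decoded Qwen3-style reply into (reasoning, answer).

    Assumes the reply was decoded with skip_special_tokens=False so the
    <think>...</think> wrapper survives; returns empty reasoning when
    thinking mode was off or no tags are present.
    """
    start_tag, end_tag = "<think>", "</think>"
    if start_tag in reply and end_tag in reply:
        head, _, tail = reply.partition(end_tag)
        reasoning = head.split(start_tag, 1)[1].strip()
        return reasoning, tail.strip()
    return "", reply.strip()

# Mocked-up reply (not real model output), just to show the split:
demo = "<think>A heist! She nicks the stars one by one.</think>Gather round, ye dogs..."
reasoning, answer = split_thinking(demo)
```

Handy when ye want to log the pondering but only read the tale aloud at the tavern.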
Training Details
What He Was Fed
- Synthetic pirate myths, bard songs, ghost ship tales, shanties, and ancient riddles
- Based on Qwen3's multilingual and coding superbrain
- Lovingly fine-tuned by the Mind Expander pirate council
Training Vibe
- BF16 mixed-precision brew
- Heavy thinking mode enabled for smart piratey logic
Evaluation
How He Was Tested
- Pirate Lore Challenge: custom benchmark (Arrr-rated ★★★★★)
- Multi-turn chaotic conversations and riddle-solving
- Human feedback ("Would drink with him?" survey: 93% said Aye)
Environmental Impact
- Hardware: 8x A100s (ye mighty GPUs)
- Training Hours: 40 groggy hours
- Cloud: Green-powered cloud node
- CO₂e Emissions: ~12.4 kg (offset by planting 1.2 drunken sailor trees)
Technical Rum-ology
- 14.8B parameters
- 32,768-token native context
- Grouped-query attention (GQA)
- Reasoning mode toggle: /think vs /no_think
- Supports huge context lengths (with YaRN, up to 131,072 tokens)
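The /think and /no_think switches ride inside the user message itself; the chat template reads them, and in a multi-turn voyage the most recent switch wins. A tiny helper (the function name is mine, purely illustrative) shows the idea:

```python
def with_mode(prompt: str, thinking: bool) -> str:
    """Append Qwen3's soft switch to a user turn.

    /think and /no_think are plain-text switches honored by the chat
    template; the last one seen in the conversation takes effect.
    """
    return f"{prompt} {'/think' if thinking else '/no_think'}"

# Freestyle shanty, no deep pondering needed:
messages = [{"role": "user",
             "content": with_mode("Quick, what rhymes with doubloon?", thinking=False)}]
```

For the 131k-token seas, the upstream Qwen3 docs enable YaRN via a `rope_scaling` entry in the model config (factor 4.0 over the native 32,768 tokens); check the base model's README before hoisting that sail.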
Citation
If ye find treasure using Skilgrimr, do us a solid and cite this project:

```bibtex
@misc{mindexpander2025skilgrimr,
  title={Skilgrimr-Qwen3-14B: The Pirate Bard AI Model},
  author={Mind Expander Collective},
  year={2025},
  url={https://huggingface.co/your-username/Skilgrimr-Qwen3-14B}
}
```
Model Card Contact
Contact: [Twitter/X @ProjectMindBot]
SOS Flag: DM if the Kraken eats your ship
"In ale and code we trust; in storms and songs we sail."