▄▄▄▄▄    ▄███▄   ▄█▄      ▄      ▄   ██▄   ██      ████    ▄████▄
       █    ▀▄  █▀   ▀  █▀ ▀▄     █      █  █  █  █ █     █   █    █ 
    ▄   ▀▀▀▀▄   ██▄▄    █   ▀  █   █ ██   █ █   █ █▄▄█     █   █    █▄▄▄▄▄
     ▀▀▄▄▄▄▀    █▄   ▄▀ █▄  ▄▀ █   █ █ █  █ █  █  █  █      █   █          █ 
               ▀███▀   ▀███▀  █▄ ▄█ █  █ █ ███▀      █       ████▀ ▐█  ▄▄▄▄█  
                               ▀▀▀  █   ██         █
                                                 ▀
   
                      ⋆⋆୨୧˚ THE PRIMÉTOILE ENGINE ˚୨୧⋆。˚⋆
                  — Visual Novel generation under starlight —

| Version | Type | Strengths | Weaknesses | Recommended Use |
|---|---|---|---|---|
| Secunda-0.1-GGUF / RAW | Instruction | Most precise; coherent code; perfected Modelfile | Smaller context / limited flexibility | Production / Baseline |
| Secunda-0.3-F16-QA | QA-based input | Acceptable for question-based generation | Less accurate than 0.1; not as coherent | Prototyping (QA mode) |
| Secunda-0.3-F16-TEXT | Text-to-text | Flexible for freeform tasks | Slightly off; Modelfile-dependent | Experimental / Text rewrite |
| Secunda-0.3-GGUF | GGUF build | Portable GGUF of 0.3 | Inherits 0.3 weaknesses | Lightweight local testing |
| Secunda-0.5-RAW | QA (natural) | Best QA understanding; long-form generation potential | Inconsistent output length; some instability | Research / Testing LoRA |
| Secunda-0.5-GGUF | GGUF build | Portable, inference-ready version of 0.5 | Shares issues of 0.5 | Offline experimentation |
| Secunda-0.1-RAW | Instruction | Same base as 0.1-GGUF | Same as 0.1 | Production backup |
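
The GGUF builds listed above are intended for lightweight local testing and offline experimentation. A common way to run a GGUF file locally is llama-cpp-python; the snippet below is only a sketch, and the model_path value is a placeholder for wherever your GGUF file lives, not a published filename.

```python
# Sketch: running a GGUF build (e.g. Secunda-0.5-GGUF) locally with llama-cpp-python.
# "secunda-0.5.gguf" is a placeholder path, not an actual released filename.
from llama_cpp import Llama

llm = Llama(model_path="secunda-0.5.gguf", n_ctx=4096)  # context size is illustrative

result = llm(
    "A detective wakes up in a town where no one remembers him.",
    max_tokens=800,
    temperature=0.8,
)
print(result["choices"][0]["text"])  # should contain a Ren'Py-style script
```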

β‹†βΊβ‚Šβ‹†Secunda ☾ 0.5 RAWβ‹†βΊβ‚Šβ‹†

For more concise and controlled results, I recommend using Secunda-0.1-GGUF (instruct) or Secunda-0.3-GGUF (natural).

Secunda-0.5-RAW is the third iteration of the Secunda visual novel generation series, built to transform natural language story prompts into fully structured, high-quality Ren'Py scripts without relying on explicit instructions or formatting. It was fine-tuned on a carefully curated dataset of fictional .rpy scenes, each paired with its original narrative concept.

This model is much heavier than its siblings: Secunda-0.1-RAW, Secunda-0.3-F16-QA and Secunda-0.3-F16-TEXT.
Secunda is now based on both direct QA and structured prompting.

🚀 Model Highlights

  • Instruction-Free Generation: Unlike Secunda-0.1, this version relies only on a natural prompt such as "A detective wakes up in a town where no one remembers him."
  • Finetuned with QLoRA (FP16): Efficient low-resource fine-tuning of LLaMA 3.1 8B, with a 4-bit quantized base and FP16 compute.
  • Script-Style Outputs: Generates structured .rpy files, often including characters, backgrounds, dialogue, and a closing return statement.
  • More Creative Freedom: Produces more diverse narratives and styles, occasionally multi-scene outputs.

/!\ NO HUMAN-MADE DATA WAS USED TO TRAIN THIS AI! Secunda takes much pride in making sure the training data is scripted! /!\

If you like Visual Novels, please visit itch.io and support independent creators!

🧠 Training Details

  • Base model: meta-llama/Meta-Llama-3.1-8B
  • Finetuning: QLoRA 4-bit adapters (FP16 runtime)
  • Data: 800+ handcrafted prompt/script pairs in .jsonl, each with a raw natural language idea and a full .rpy script (record format sketched after this list).
  • Hardware: NVIDIA RTX 4070
  • Training time: ~12 hours, 20 epochs
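
The exact hyperparameters and record schema are not published in this card, so the following is only a rough sketch of what a QLoRA run like the one described above could look like with transformers, peft, and bitsandbytes. The field names (prompt, script), the LoRA rank/alpha values, and the target modules are illustrative assumptions, not the actual Secunda configuration.

```python
# Rough sketch of a QLoRA setup matching the details above (4-bit base, FP16 compute).
# Field names and hyperparameters are assumptions, not the real Secunda configuration.
import json
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# One assumed .jsonl record (the dataset holds 800+ of these):
record = {
    "prompt": "A detective wakes up in a town where no one remembers him.",
    "script": 'define d = Character("Detective")\n\nlabel start:\n    scene bg town\n    d "Where is everyone?"\n    return\n',
}
with open("data.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")

base = "meta-llama/Meta-Llama-3.1-8B"
bnb = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit quantized base weights (QLoRA)
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,   # FP16 compute, as described in the card
)
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, quantization_config=bnb, device_map="auto")
model = prepare_model_for_kbit_training(model)

lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,                    # illustrative values
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
# The training loop itself (e.g. trl's SFTTrainer over the prompt/script pairs) is omitted.
```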

πŸ“ Prompt Style

This model works without instructions. Just write your idea naturally:

"An abandoned cafΓ© reopens every full moon to serve ghosts their last coffee."


⚠️ Known Issues

  • May occasionally produce multiple unrelated scripts in a single output if the token limit is set too high.
  • Less deterministic than the instruction-based versions; retrying the prompt may help.
  • Not guaranteed to end with a return statement (you may need to add it manually).

📌 Tips

  • Keep prompts between 10–30 words for best coherence.
  • If the output includes multiple scenes, you can split them manually.
  • If needed, combine with the sanitize_output() logic from Secunda-0.1 to post-process outputs (a rough equivalent is sketched below).
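
The sanitize_output() helper from Secunda-0.1 is not reproduced in this card; the snippet below is a rough, assumed equivalent that keeps only the first generated script and guarantees a trailing return, in line with the known issues above.

```python
# Rough post-processing sketch (not the actual sanitize_output() from Secunda-0.1):
# keep everything up to the first "return" line and append one if the model forgot it.
def sanitize_output(raw: str) -> str:
    kept = []
    for line in raw.splitlines():
        kept.append(line)
        if line.strip() == "return":   # first script is complete; drop any extra scenes
            break
    else:
        kept.append("    return")      # no return emitted: add it manually
    return "\n".join(kept) + "\n"

# Example usage:
# script = sanitize_output(generated_text)
# with open("generated_scene.rpy", "w", encoding="utf-8") as f:
#     f.write(script)
```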

🔒 Privacy

This model is private and intended for research and internal tooling for the Primétoile visual novel engine.


📚 License

Apache 2.0, for research, testing, and development only.


✨ Credits

Trained and maintained by Yaroster for the Secunda engine inside the Primétoile framework.
