Tags: text-generation, transformers, safetensors, gguf, mistral, roblox, luau, code, sft, trl, unsloth, conversational, text-generation-inference

Luau Devstral 24B Instruct v0.1

A Roblox Luau-focused finetune of Devstral Small 2507.

Model Details

Model Description

Devstral Small 2507 is a powerful choice for local inference, achieving SOTA open-source results at just 24B parameters. However, Roblox gamedev and Luau programming are generally not well represented in LLM training data. This model fine-tunes Devstral on a corpus of permissively licensed Luau code and Roblox documentation, improving its Luau programming capabilities. Additionally, the Jinja chat template contains a default system prompt that steers the model toward Luau development even further.
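
As a minimal usage sketch (assuming a standard transformers setup; the prompt and generation settings are illustrative), the model can be loaded and prompted through the bundled chat template, which injects the default Luau system prompt automatically:

```python
# Minimal usage sketch: the chat template supplies the default Luau-focused system prompt,
# so no explicit system message is needed. Generation settings here are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TorpedoSoftware/Luau-Devstral-24B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    {"role": "user", "content": "Write a Luau function that fades a Part's Transparency from 0 to 1 over 2 seconds."}
]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The provided GGUF quantizations can likewise be run with llama.cpp-compatible runtimes.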

Model Sources

Training Details

Training Data

  1. https://huggingface.co/datasets/TorpedoSoftware/the-luau-stack

    25.917M lines of real Luau code, 0.452B tokens. Format (a formatting sketch follows this list):

    Repository: {repo_name}
    Repository Description: {repo_description}
    
    File Path: `{file_path}`
    File Content:
    ```Lua
    {file_content}
    ```
    
  2. https://huggingface.co/datasets/TorpedoSoftware/roblox-info-dump

    19.6K pages of multilingual Roblox documentation, 0.149B tokens. Format:

    Roblox Creator Docs: {url}
    ```md
    {content}
    ```
    
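As referenced above, both formats are simple plain-text templates. A small illustrative sketch of how samples could be assembled (the function names are hypothetical; the actual preprocessing code is not published here):

```python
# Hypothetical helpers that reproduce the sample formats shown above.
# Field names mirror the placeholders in the format descriptions.
def format_luau_stack_sample(repo_name: str, repo_description: str, file_path: str, file_content: str) -> str:
    return (
        f"Repository: {repo_name}\n"
        f"Repository Description: {repo_description}\n\n"
        f"File Path: `{file_path}`\n"
        "File Content:\n"
        f"```Lua\n{file_content}\n```"
    )

def format_roblox_info_dump_sample(url: str, content: str) -> str:
    return f"Roblox Creator Docs: {url}\n```md\n{content}\n```"
```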

Training Process

A LoRA adapter (r=64) was trained at full precision for two epochs of the dataset, totaling 54,630 steps and 43.40 EFLOPs of compute. The final adapter checkpoint was then merged into the base model to produce a BF16 model.
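
A condensed sketch of a comparable setup with peft and trl is shown below; only the LoRA rank, the two-epoch schedule, and the BF16 merge come from this card, and every other value (and the tooling itself) is illustrative rather than a record of the actual run:

```python
# Illustrative LoRA SFT sketch; not the exact training script used for this model.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Dataset text formatting (as described in the Training Data section) is omitted for brevity.
train_dataset = load_dataset("TorpedoSoftware/the-luau-stack", split="train")

peft_config = LoraConfig(r=64, lora_alpha=64, target_modules="all-linear", task_type="CAUSAL_LM")

trainer = SFTTrainer(
    model="mistralai/Devstral-Small-2507",  # base model; assumed Hugging Face repo id
    train_dataset=train_dataset,
    peft_config=peft_config,
    args=SFTConfig(num_train_epochs=2, per_device_train_batch_size=1, gradient_accumulation_steps=8),
)
trainer.train()

# Merge the final adapter checkpoint back into the base weights and save as BF16.
merged_model = trainer.model.merge_and_unload()
merged_model.save_pretrained("Luau-Devstral-24B-Instruct-v0.1")
```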

Training Loss Curve

[Figure: training loss curve]

Imatrix Calibration

The imatrix for the GGUF quantizations was computed from 5.73 MB of calibration text combining technical.txt, groups_merged.txt, and content from the-luau-stack & roblox-info-dump. This produces an imatrix that is well suited to the specialized tasks this model is designed for while still preserving broader general capability. Several quantizations are provided already, but the imatrix.gguf is also included in this repository should you want to create other quants yourself.
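
For example, a minimal sketch (assuming the huggingface_hub client; the llama.cpp command in the comment is the typical pattern, with placeholder file names) for reusing the bundled imatrix:

```python
# Sketch: fetch the bundled imatrix.gguf so it can be reused when producing new quants.
from huggingface_hub import hf_hub_download

imatrix_path = hf_hub_download(
    repo_id="TorpedoSoftware/Luau-Devstral-24B-Instruct-v0.1",
    filename="imatrix.gguf",
)

# With llama.cpp, a quantization run would then look roughly like (placeholder file names):
#   llama-quantize --imatrix <imatrix_path> <model-f16.gguf> <model-Q4_K_M.gguf> Q4_K_M
print(f"imatrix downloaded to {imatrix_path}")
```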

Environmental Impact

Carbon emissions were estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

  • Hardware Type: A100 80GB PCIe
  • Hours used: 60
  • Carbon Emitted: ~4.5 kg CO2eq (equivalent to ~10.1 miles driven by an average ICE car)