---

license: gemma
license_link: https://ai.google.dev/gemma/terms
base_model: google/gemma-3-27b-it
finetuned_by: boatbomber
pipeline_tag: text-generation
tags:
- chat
- roblox
- luau
language:
- en
datasets:
- boatbomber/roblox-info-dump
- boatbomber/the-luau-stack
---


# Gemma-3-27B-Roblox-Luau

A fine-tune of [google/gemma-3-27b-it](https://huggingface.co/google/gemma-3-27b-it) using [boatbomber/roblox-info-dump](https://huggingface.co/datasets/boatbomber/roblox-info-dump) and [boatbomber/the-luau-stack](https://huggingface.co/datasets/boatbomber/the-luau-stack) for Roblox domain knowledge.

Available quants:

| Quant  | Size    | Notes |
| ------ | ------- | ----- |
| Q8_0   | 30.21GB | High resource use for a small quality gain. Use only when accuracy is crucial. |
| Q6_K   | 23.32GB | Uses Q6_K for all tensors. Good for high-end GPUs. |
| Q5_K_M | 20.24GB | Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q5_K. |
| Q4_K_M | 17.34GB | **Recommended.** Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q4_K. |
| Q3_K_M | 14.04GB | Uses Q4_K for the attention.wv, attention.wo, and feed_forward.w2 tensors, else Q3_K. Quality is noticeably degraded. |