---
license: gemma
license_link: https://ai.google.dev/gemma/terms
base_model: google/gemma-3-27b-it
finetuned_by: boatbomber
pipeline_tag: text-generation
tags:
- chat
- roblox
- luau
language:
- en
datasets:
- boatbomber/roblox-info-dump
- boatbomber/the-luau-stack
---

# Gemma-3-27B-Roblox-Luau

A fine-tune of [google/gemma-3-27b-it](https://huggingface.co/google/gemma-3-27b-it) trained on [boatbomber/roblox-info-dump](https://huggingface.co/datasets/boatbomber/roblox-info-dump) and [boatbomber/the-luau-stack](https://huggingface.co/datasets/boatbomber/the-luau-stack) for Roblox domain knowledge.

Available quants:

| Quant | Size | Notes |
| ------ | ------- | ----- |
| Q8_0 | 30.21GB | High resource use, but generally acceptable. Use only when accuracy is crucial. |
| Q6_K | 23.32GB | Uses Q6_K for all tensors. Good for high-end GPUs. |
| Q5_K_M | 20.24GB | Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q5_K. |
| Q4_K_M | 17.34GB | **Recommended.** Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q4_K. |
| Q3_K_M | 14.04GB | Uses Q4_K for the attention.wv, attention.wo, and feed_forward.w2 tensors, else Q3_K. Quality is noticeably degraded. |
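As a minimal sketch of running one of the quants locally with llama.cpp, assuming the GGUF files are published in this repository and that a recent `llama-cli` build is installed (the `user/repo:quant` shorthand and exact filenames are assumptions — check this repo's Files tab for the actual GGUF names):

```shell
# Hypothetical invocation: pull the recommended Q4_K_M quant from the Hub
# and start an interactive chat. Adjust -ngl to how many layers fit on
# your GPU, and verify the repo/quant tag against the Files tab first.
llama-cli \
  -hf boatbomber/Gemma-3-27B-Roblox-Luau:Q4_K_M \
  -ngl 99 \
  -p "Write a Luau function that debounces a BasePart.Touched event."
```

Lower quants trade accuracy for memory, so start with Q4_K_M and only step down if the model does not fit on your hardware.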