license: mit
language:
  - en
base_model:
  - jhu-clsp/ettin-encoder-32m
pipeline_tag: token-classification
tags:
  - token classification
  - hallucination detection
  - retrieval-augmented generation
  - transformers
  - ettin
  - lightweight
datasets:
  - ragtruth
  - KRLabsOrg/rag-bioasq-lettucedetect
library_name: transformers

TinyLettuce (Ettin-32M): Efficient Hallucination Detection

Model Name: tinylettuce-ettin-32m-en

Organization: KRLabsOrg

GitHub: https://github.com/KRLabsOrg/LettuceDetect

Ettin encoders: https://arxiv.org/pdf/2507.11412

Overview

TinyLettuce is a lightweight token‑classification model that flags unsupported spans in an answer given its context (span aggregation is performed downstream). Built on the 32M Ettin encoder with a token‑classification head, it targets real‑time CPU inference and low‑cost domain fine‑tuning. This variant is trained only on our synthetic data and the RAGTruth dataset for hallucination detection, balancing accuracy and speed for CPU‑friendly deployment.

Model Details

  • Architecture: Ettin encoder (32M) + token‑classification head
  • Task: token classification (0 = supported, 1 = hallucinated)
  • Input: [CLS] context [SEP] question [SEP] answer [SEP], up to 4096 tokens
  • Language: English; License: MIT
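
As a rough illustration of this layout, here is a minimal sketch that runs the checkpoint directly with the transformers API. Joining context and question into one segment, with the answer as the pair, is a simplifying assumption about the layout above; in practice the lettucedetect library (see Usage below) builds the input and aggregates spans for you.

import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "KRLabsOrg/tinylettuce-ettin-32m-en"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

context = "Ibuprofen is an NSAID that reduces inflammation and pain. The typical adult dose is 400-600mg every 6-8 hours, not exceeding 2400mg daily."
question = "What is the maximum daily dose of ibuprofen?"
answer = "The maximum daily dose of ibuprofen for adults is 3200mg."

# Simplified encoding: context + question as the first segment, answer as the second.
enc = tokenizer(context + " " + question, answer, return_tensors="pt",
                truncation=True, max_length=4096)
enc.pop("token_type_ids", None)  # the encoder does not use segment ids

with torch.no_grad():
    logits = model(**enc).logits  # shape: (1, seq_len, 2)

# Label 1 = hallucinated token, label 0 = supported (see Model Details above).
pred = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
print([tok for tok, label in zip(tokens, pred) if label == 1])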

Training Data

  • RAGTruth (English) and our synthetic data (KRLabsOrg/rag-bioasq-lettucedetect), with span‑level labels

Training Procedure

  • Tokenizer: AutoTokenizer; DataCollatorForTokenClassification; label pad −100
  • Max length: 4096; batch size: 8; epochs: 3–6
  • Optimizer: AdamW (lr 1e‑5, weight_decay 0.01)
  • Hardware: Single A100 80GB
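
For domain fine-tuning along these lines, a minimal sketch with the Hugging Face Trainer (whose default optimizer is AdamW) is shown below. The make_example helper and the single toy record are illustrative assumptions rather than the project's actual data pipeline: real training data carries span‑level labels, whereas the toy example labels the entire answer with one class.

from transformers import (
    AutoTokenizer,
    AutoModelForTokenClassification,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

base = "jhu-clsp/ettin-encoder-32m"  # or start from KRLabsOrg/tinylettuce-ettin-32m-en
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForTokenClassification.from_pretrained(base, num_labels=2)

def make_example(context, question, answer, hallucinated):
    # Toy label alignment: answer tokens get one label, everything else is
    # masked out of the loss with -100 (real data uses span-level labels).
    enc = tokenizer(context + " " + question, answer, truncation=True, max_length=4096)
    enc.pop("token_type_ids", None)  # the encoder does not use segment ids
    enc["labels"] = [
        (1 if hallucinated else 0) if seq_id == 1 else -100
        for seq_id in enc.sequence_ids()
    ]
    return enc

train_dataset = [
    make_example(
        "The typical adult dose is 400-600mg every 6-8 hours, not exceeding 2400mg daily.",
        "What is the maximum daily dose of ibuprofen?",
        "The maximum daily dose of ibuprofen for adults is 3200mg.",
        hallucinated=True,
    ),
]

collator = DataCollatorForTokenClassification(tokenizer, label_pad_token_id=-100)
args = TrainingArguments(
    output_dir="tinylettuce-finetune",
    learning_rate=1e-5,
    weight_decay=0.01,
    per_device_train_batch_size=8,
    num_train_epochs=3,
    report_to="none",
)
Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=collator,
    tokenizer=tokenizer,
).train()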

Results (RAGTruth)

This model is designed primarily for fine-tuning on smaller, domain-specific samples rather than for general-purpose use, though it still performs respectably on RAGTruth (72.15% F1 vs. 76.07% for our ModernBERT-based lettucedetect-base model).

| Model | Parameters | F1 (%) |
|-------|------------|--------|
| TinyLettuce-32M | 32M | 72.15 |
| LettuceDetect-base (ModernBERT) | 150M | 76.07 |
| LettuceDetect-large (ModernBERT) | 395M | 79.22 |
| Llama-2-13B (RAGTruth FT) | 13B | 78.70 |

Usage

First install lettucedetect:

pip install lettucedetect

Then use it:

from lettucedetect.models.inference import HallucinationDetector

# Transformer-based detector backed by this checkpoint.
detector = HallucinationDetector(
    method="transformer",
    model_path="KRLabsOrg/tinylettuce-ettin-32m-en",
)

# Flag unsupported spans in the answer, given the context and question.
spans = detector.predict(
    context=[
        "Ibuprofen is an NSAID that reduces inflammation and pain. The typical adult dose is 400-600mg every 6-8 hours, not exceeding 2400mg daily."
    ],
    question="What is the maximum daily dose of ibuprofen?",
    answer="The maximum daily dose of ibuprofen for adults is 3200mg.",
    output_format="spans",
)
print(spans)
# Output: [{"start": 51, "end": 57, "text": "3200mg"}]
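
Because context is a list, several retrieved passages can be checked in one call. The snippet below reuses the detector from the example above and only assumes that each list element is treated as one passage.

spans = detector.predict(
    context=[
        "Ibuprofen is an NSAID that reduces inflammation and pain.",
        "The typical adult dose is 400-600mg every 6-8 hours, not exceeding 2400mg daily.",
    ],
    question="What is the maximum daily dose of ibuprofen?",
    answer="The maximum daily dose of ibuprofen for adults is 3200mg.",
    output_format="spans",
)
print(spans)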

Citing

If you use the model or the tool, please cite the following paper:

@misc{Kovacs:2025,
      title={LettuceDetect: A Hallucination Detection Framework for RAG Applications}, 
      author={Ádám Kovács and Gábor Recski},
      year={2025},
      eprint={2502.17125},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2502.17125}, 
}