T5-Large Fine-tuned on XSum

Task: Abstractive Summarization (English)
Base Model: google-t5/t5-large
License: MIT

Overview

This model is a T5-Large checkpoint fine-tuned exclusively on the XSum dataset. It specializes in generating concise, single-sentence summaries in the style of BBC article abstracts.

Performance (XSum test set)

Metric         Score
ROUGE-1        26.89
ROUGE-2         6.94
ROUGE-L        21.28
Loss            2.54
Avg. Length    18.77 tokens
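
These scores can be reproduced approximately with the Hugging Face evaluate and datasets libraries. The sketch below is illustrative rather than the original evaluation script: the dataset identifier (EdinburghNLP/xsum), the 100-example slice, and the generation settings are assumptions.

# Minimal sketch: ROUGE on a slice of the XSum test set (assumptions noted above).
from datasets import load_dataset
from transformers import pipeline
import evaluate

summarizer = pipeline("summarization", model="sysresearch101/t5-large-finetuned-xsum")
rouge = evaluate.load("rouge")

# Small test slice for a quick check; the reported figures use the full test split.
test = load_dataset("EdinburghNLP/xsum", split="test[:100]")
predictions = [
    out["summary_text"]
    for out in summarizer(test["document"], max_length=80, min_length=20,
                          truncation=True, do_sample=False)
]
scores = rouge.compute(predictions=predictions, references=test["summary"])
print({name: round(value * 100, 2) for name, value in scores.items()})

Scores will vary with decoding parameters; beam search generally trades speed for slightly higher ROUGE.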

Usage

Quick Start

from transformers import pipeline

summarizer = pipeline("summarization", model="sysresearch101/t5-large-finetuned-xsum")

article = "Your article text here..."
summary = summarizer(article, max_length=80, min_length=20, do_sample=False)
print(summary[0]['summary_text'])

Advanced Usage

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("sysresearch101/t5-large-finetuned-xsum")
model = AutoModelForSeq2SeqLM.from_pretrained("sysresearch101/t5-large-finetuned-xsum")

article = "Your article text here..."
inputs = tokenizer("summarize: " + article, return_tensors="pt", max_length=512, truncation=True)
outputs = model.generate(
    **inputs,
    max_length=80,
    min_length=20,
    num_beams=4,
    no_repeat_ngram_size=2,
    length_penalty=1.0,
    repetition_penalty=2.5,
    use_cache=True,
    early_stopping=True,
    do_sample=True,   # combined with num_beams > 1 this performs beam-sample decoding
    temperature=0.8,
    top_k=50,
    top_p=0.95
)

summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(summary)

Training Data

  • XSum: BBC articles paired with professionally written single-sentence summaries (see the loading sketch below)
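
A minimal sketch of loading and inspecting the dataset with the datasets library follows; the EdinburghNLP/xsum hub identifier and the document/summary field names are assumptions based on the public XSum listing, not details from this card.

# Minimal sketch: inspect one XSum training example (assumed hub id "EdinburghNLP/xsum").
from datasets import load_dataset

xsum = load_dataset("EdinburghNLP/xsum")
example = xsum["train"][0]
print(example["document"][:200])  # opening of the BBC article body
print(example["summary"])         # single-sentence reference summary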

Intended Use

  • Primary: Extreme (single-sentence) abstractive summarization of English news articles
  • Secondary: Research on extreme summarization and single-sentence summary generation, educational demonstrations, comparative studies with multi-sentence summarization models
  • Not recommended: Multi-sentence summarization tasks, production use without validation

Limitations

  • Trained only on news domain; may not generalize to other text types
  • Generates very short summaries (average ~19 tokens)
  • May oversimplify complex topics due to single-sentence constraint

Citation

@misc{stept2023_t5_large_xsum,
  author = {Shlomo Stept (sysresearch101)},
  title = {T5-Large Fine-tuned on XSum for Abstractive Summarization},
  year = {2023},
  publisher = {Hugging Face},
  url = {https://huggingface.co/sysresearch101/t5-large-finetuned-xsum}
}

Papers Using This Model

Contact

Created by Shlomo Stept (ORCID: 0009-0009-3185-589X), DARMIS AI
