Model Card for Patient-Friendly Clinical Discharge Summarizer (T5-small)

Model Details

Model Description

Fine-tuned from google/t5-small to simplify behavioral health discharge notes from MIMIC-IV-BHC into patient-friendly summaries. Part of the Patient-Friendly Summarization of Clinical Discharge Notes project; a retrieval-augmented generation (RAG) variant was also explored for added factual grounding.

  • Developed by: Dhyan Patel & Vidit Gandhi
  • Model type: Encoder–decoder transformer (seq2seq)
  • Language(s): English
  • License: Apache-2.0
  • Finetuned from: google/t5-small
  • Model size: ~60.5M parameters (F32, safetensors)

Model Sources

  • Repository:
  • Dataset: MIMIC-IV-BHC (PhysioNet)

Uses

Direct Use

  • Generate lay summaries from behavioral health discharge notes.

Downstream Use

  • Embed in EHR or patient-portal workflows; pair with a RAG pipeline to ground definitions and instructions (see the sketch below).
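
The pairing can be as light as retrieving short definitions for clinical terms and prepending them to the note before summarization. Below is a minimal sketch, assuming a hypothetical keyword-matched glossary in place of a real retriever and reusing the checkpoint name from the getting-started example; the project's actual RAG setup may differ.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical checkpoint id; replace with the published fine-tuned model.
model_id = "your-username/patient-friendly-mimic-iv-bhc-t5-small"
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Toy "retrieval" store: a keyword-matched glossary standing in for a real
# retriever (e.g., a vector index over vetted patient-education material).
GLOSSARY = {
    "SSRI": "SSRI: a type of medication commonly used to treat depression and anxiety.",
    "PRN": "PRN: take the medication only when needed.",
    "outpatient": "Outpatient: care you receive without staying overnight in the hospital.",
}

def retrieve_context(note: str) -> str:
    """Return glossary entries whose keywords appear in the note."""
    hits = [entry for term, entry in GLOSSARY.items() if term.lower() in note.lower()]
    return " ".join(hits)

def summarize_with_context(note: str) -> str:
    # Prepend retrieved definitions so the model can ground lay explanations.
    grounded_input = f"{retrieve_context(note)} {note}".strip()
    inputs = tok(grounded_input, return_tensors="pt", truncation=True, max_length=512)
    summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
    return tok.decode(summary_ids[0], skip_special_tokens=True)

print(summarize_with_context("Paste a BHC discharge note mentioning an SSRI taken PRN..."))

In a fuller pipeline, the keyword lookup would be swapped for a proper retriever (for example, embedding search over a patient-education corpus), but the prepend-then-summarize flow stays the same.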

Out-of-Scope

  • Automated diagnosis/prescribing or any unsupervised clinical decision-making.

Bias, Risks, and Limitations

  • May omit subtle clinical nuance; behavioral health notes can include sensitive content, so human review of every generated summary is required.
  • May hallucinate details when given incomplete or truncated context.

How to Get Started

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the fine-tuned checkpoint (replace with the published model id).
model_id = "your-username/patient-friendly-mimic-iv-bhc-t5-small"
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Tokenize the note; inputs longer than 512 tokens are truncated.
text = "Paste a BHC discharge note..."
inputs = tok(text, return_tensors="pt", truncation=True, max_length=512)

# Generate and decode the patient-friendly summary.
summary_ids = model.generate(**inputs, max_length=128)
print(tok.decode(summary_ids[0], skip_special_tokens=True))
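
Note that inputs are truncated to 512 tokens, so very long discharge notes may need to be split into sections and summarized separately. Generation settings such as max_length and beam search can be tuned to trade off summary length and fluency.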