MedGPT: GPT-2 Fine-Tuned on PubMed RCT

MedGPT is a GPT-2 model fine-tuned on the pubmed-200k-rct dataset. It classifies individual sentences from biomedical abstracts into one of five standard sections:

  • Background
  • Objective
  • Methods
  • Results
  • Conclusion

This model is useful for tasks requiring structured understanding or summarization of scientific literature.
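
For example, a sentence can be classified with the standard transformers AutoClasses. This is a minimal sketch, assuming the checkpoint was saved with a GPT-2 sequence-classification head and that the id2label mapping above was stored in the config; if it was not, the printed label will fall back to generic LABEL_n names.

```python
# Minimal inference sketch (assumes a sequence-classification head and a saved id2label mapping).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "devmanpreet/Medical-GPT2-Classifier"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# GPT-2 has no padding token by default; reuse the EOS token if none was saved.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
    model.config.pad_token_id = tokenizer.eos_token_id

sentence = "Patients were randomly assigned to receive either the drug or a placebo."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])  # e.g. "Methods"
```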

Training Details

  • Base Model: gpt2 (124M parameters)
  • Dataset: pietrolesci/pubmed-200k-rct
  • Task: Sentence classification
  • Labels: Background, Objective, Methods, Results, Conclusion
  • Epochs: 1 (partial training)
  • Loss Function: CrossEntropy
  • Optimizer: AdamW
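
The settings above correspond to a short fine-tuning run with the transformers Trainer (which uses AdamW and cross-entropy for sequence classification by default). The sketch below is a reconstruction under those settings, not the exact training code: the dataset column names ("text", "labels"), the string label casing, the batch size, and the learning rate are assumptions and may need to be adjusted to the actual dataset schema.

```python
# Fine-tuning sketch: GPT-2 base, 5-way sentence classification, 1 epoch.
# Column names and label format below are assumptions about the dataset schema.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    GPT2ForSequenceClassification,
    Trainer,
    TrainingArguments,
)

labels = ["Background", "Objective", "Methods", "Results", "Conclusion"]
label2id = {l: i for i, l in enumerate(labels)}

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token

model = GPT2ForSequenceClassification.from_pretrained(
    "gpt2",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id=label2id,
)
model.config.pad_token_id = tokenizer.pad_token_id

dataset = load_dataset("pietrolesci/pubmed-200k-rct")

def preprocess(batch):
    enc = tokenizer(batch["text"], truncation=True, max_length=128)
    # Assumes the section labels are stored as strings (e.g. "METHODS").
    enc["labels"] = [label2id[l.title()] for l in batch["labels"]]
    return enc

tokenized = dataset.map(preprocess, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="medgpt-rct",
        num_train_epochs=1,              # single (partial) epoch, as noted above
        per_device_train_batch_size=16,  # assumed value
        learning_rate=5e-5,              # assumed value; Trainer uses AdamW by default
    ),
    train_dataset=tokenized["train"],
    tokenizer=tokenizer,
)
trainer.train()
```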