VTE-BERT Hugging Face Model Usage Terms
The hosted AI model contains weights derived from datasets that may include Protected Health Information (PHI). Users must agree to the following:
- The model is provided strictly for research and development purposes.
- It must not be used for diagnostic, treatment, or clinical decision-making.
- Appropriate security and compliance measures must be implemented.
- Any model modifications must comply with AGPLv3.
- Reverse engineering or attempts to extract training data from the model are strictly prohibited.

VTE-BERT Model Card

This repository contains a fine-tuned Bio_ClinicalBERT model (VTE-BERT) specialized for text classification tasks related to venous thromboembolism (VTE) in clinical narratives from patients diagnosed with cancer at Harris Health System (HHS) between 2011 and 2020.

Model Details

  • Developed by: Ang Li Lab
  • Base Model: Bio_ClinicalBERT
  • Model Size: 108M parameters (F32, Safetensors)
  • Fine-tuning Dataset: Clinical notes of cancer patients from HHS (2011-2020)
  • Task: Text classification
  • Labels:
    • LE-DVT+
    • PE+
    • UE-DVT
    • atypical
    • history
    • none

Intended Use

This model is intended for use in research settings to classify clinical text according to the aforementioned labels related to VTE outcomes in cancer patients. It may assist clinical researchers, informaticists, and healthcare professionals in efficiently analyzing large volumes of clinical notes.
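In downstream research, note-level predictions are often rolled up to a patient-level outcome. The helper below is an illustrative sketch, not part of the released model: the choice to treat `history` (prior VTE) and `none` as non-acute, and the `patient_has_acute_vte` function itself, are assumptions a study would need to validate.

```python
# Labels emitted by VTE-BERT, per the model card
ACUTE_VTE_LABELS = {"LE-DVT+", "PE+", "UE-DVT", "atypical"}


def patient_has_acute_vte(note_labels):
    """Return True if any note-level label indicates an acute VTE event.

    'history' and 'none' are treated as non-acute here -- an
    illustrative convention, not one defined by the model card.
    """
    return any(label in ACUTE_VTE_LABELS for label in note_labels)


print(patient_has_acute_vte(["none", "history", "PE+"]))  # True
print(patient_has_acute_vte(["none", "history"]))         # False
```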

Usage

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# VTE-BERT is gated: accept the usage terms on the Hub, then authenticate,
# e.g. via `huggingface-cli login` or the HF_TOKEN environment variable.
model_name = "ang-li-lab/VTE-BERT"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

text = "Your text here."
result = classifier(text)
print(result)
```
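BERT-family models such as Bio_ClinicalBERT accept at most 512 tokens per input, while clinical notes frequently run longer. One common workaround is to classify overlapping windows of the tokenized note and combine the per-window predictions. The sliding-window helper below is a minimal sketch; the window and stride values (510 tokens, leaving room for the `[CLS]`/`[SEP]` special tokens, with 50% overlap) are illustrative assumptions, not repository defaults.

```python
def chunk_token_ids(token_ids, max_len=510, stride=255):
    """Split a token-id sequence into overlapping windows.

    max_len leaves room for the [CLS]/[SEP] special tokens that the
    tokenizer adds; stride sets the step between consecutive windows,
    so max_len - stride tokens overlap between neighbors.
    """
    if len(token_ids) <= max_len:
        return [token_ids]
    chunks = []
    start = 0
    while start < len(token_ids):
        chunks.append(token_ids[start:start + max_len])
        if start + max_len >= len(token_ids):
            break
        start += stride
    return chunks


# Example: a 1200-token note becomes four overlapping windows
ids = list(range(1200))
print([len(c) for c in chunk_token_ids(ids)])  # [510, 510, 510, 435]
```

Each window can then be decoded back to text (or passed as ids) through the classifier, with the note-level label chosen by, for example, taking the highest-confidence prediction across windows.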

Access and Licensing

This model is available under gated access. Please review the TERMS.md file for terms and conditions of use.

The model code and data are subject to the LICENSE included in this repository.

Resources

  • Demo: Demo Website
  • Publication:
  • GitHub: NLPMed-Engine, a robust and extensible NLP engine tailored for medical text, which uses VTE-BERT internally.

Contact

For questions or inquiries, please contact our lab through our lab website.

Citation

If you utilize this model in your research, please cite our paper:

@article{your_article_citation,
  author = {Your Authors},
  title = {Title of Your Paper},
  journal = {Journal Name},
  year = {Year},
  volume = {Volume},
  pages = {Pages},
  doi = {DOI}
}