---
license: mit
language:
  - en
pipeline_tag: text-classification
tags:
  - open-source
  - binary-classification
  - sst-2
  - distilbert
  - sentiment-analysis
---

# Model Card: Sentiment Classifier (DistilBERT - SST-2)

## Overview

This model is a fine-tuned version of `distilbert-base-uncased` on the SST-2 dataset, designed for binary sentiment classification: labeling text as either positive or negative.

It’s fast, compact, and suitable for real-time inference tasks such as social media monitoring, customer feedback triage, and lightweight embedded NLP.


## Use Cases

- Detecting sentiment in tweets, reviews, or comments
- Routing customer support tickets by tone
- Analyzing product sentiment in e-commerce or app stores
- Monitoring brand perception over time

## Example

Input: "This new update is amazing — so much faster!"
Output: Positive

Input: "This feature is broken and support isn't helping."
Output: Negative
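The examples above can be reproduced with the `transformers` pipeline API. This is a minimal sketch: it assumes the original checkpoint linked in the Authorship Note below, since this card does not state its own repository id, and the checkpoint is downloaded on first use.

```python
from transformers import pipeline

# Load the fine-tuned SST-2 checkpoint (original model listed below).
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

for text in [
    "This new update is amazing — so much faster!",
    "This feature is broken and support isn't helping.",
]:
    result = classifier(text)[0]
    print(f"{result['label']} ({result['score']:.3f}): {text}")
```

Each prediction comes back as a label (`POSITIVE` or `NEGATIVE`) plus a confidence score, which is useful for thresholding in triage or routing pipelines.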

---

## Strengths

- Extremely lightweight: good for mobile and low-latency use
- Fine-tuned on a benchmark sentiment dataset (SST-2)
- Strong out-of-the-box performance for informal English

---

## Limitations

- Binary only (positive/negative) — no neutral or nuanced emotion
- Trained on English movie reviews — may misinterpret sarcasm, cultural tone, or domain-specific feedback
- Not ideal for clinical, legal, or safety-critical sentiment tasks

---

## Model Details

- Architecture: DistilBERT
- Base model: `distilbert-base-uncased`
- Fine-tuning dataset: SST-2 (Stanford Sentiment Treebank)
- Max input: 512 tokens
- Classes: `Positive`, `Negative`
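Because of the 512-token maximum, longer documents must be truncated at tokenization time. A sketch with the lower-level API (again assuming the original checkpoint named in the Authorship Note below):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Truncate anything past the 512-token maximum instead of raising an error.
inputs = tokenizer(
    "A review long enough to need truncation. " * 200,
    truncation=True,
    max_length=512,
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
label = model.config.id2label[logits.argmax(dim=-1).item()]
print(label)
```

Note that truncation silently drops everything past the limit, so sentiment expressed only near the end of a long document will not influence the prediction.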

---

## License

MIT License — free to use, adapt, and deploy commercially.

---

## Authorship Note

This model card was written by [Sarah Mancinho](https://huggingface.co/Sarah-h-h) as part of a public AI/LLM contribution series on Hugging Face.

Original model: [`distilbert-base-uncased-finetuned-sst-2-english`](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english)

---

## Citation