# Emotion Classifier: Fine-tuned DistilBERT
This is a DistilBERT-based model fine-tuned for emotion classification on the dair-ai/emotion dataset. It predicts one of six emotion labels for a given text:
- sadness
- joy
- love
- anger
- fear
- surprise
## Model Details

- Base model: `distilbert-base-uncased`
- Fine-tuned on: `dair-ai/emotion`
- Task: multi-class, single-label text classification
- Framework: 🤗 Transformers
## Dataset

Dataset: `dair-ai/emotion`

| Split | Samples |
|---|---|
| Train | 16,000 |
| Validation | 2,000 |
| Test | 2,000 |
| Total | 20,000 |
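For reference, the split sizes above form an 80/10/10 partition of the 20,000 samples; a quick check in plain Python:

```python
# Split sizes taken from the table above.
splits = {"train": 16_000, "validation": 2_000, "test": 2_000}

total = sum(splits.values())  # 20,000 samples overall
fractions = {name: n / total for name, n in splits.items()}
print(fractions)  # train: 0.8, validation: 0.1, test: 0.1
```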
Classes (labels):

| ID | Label |
|---|---|
| 0 | sadness |
| 1 | joy |
| 2 | love |
| 3 | anger |
| 4 | fear |
| 5 | surprise |
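The ID-to-label mapping above can be written as a plain Python dict (a minimal sketch; the fine-tuned model's config stores the same mapping in its `id2label` field):

```python
# Label mapping used by the dair-ai/emotion dataset (class IDs 0-5).
ID2LABEL = {
    0: "sadness",
    1: "joy",
    2: "love",
    3: "anger",
    4: "fear",
    5: "surprise",
}

# Inverse mapping, handy when encoding labels for training.
LABEL2ID = {label: idx for idx, label in ID2LABEL.items()}

def decode(pred_id: int) -> str:
    """Turn a predicted class index into its emotion name."""
    return ID2LABEL[pred_id]

print(decode(1))  # joy
```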
## Usage

You can use the model directly with the 🤗 Transformers pipeline:

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="shivvamm/emotion-distilbert-finetuned",
    top_k=None,  # return scores for all six labels
)

text = "I feel hopeful and excited about the future."
results = classifier(text)
print(results)
```
Example output (top-scoring label shown; with `top_k=None` the pipeline returns score dicts for all six labels):

```python
[{'label': 'joy', 'score': 0.9876}]
```
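Under the hood, the pipeline runs the text through DistilBERT and applies a softmax to the six output logits. A minimal sketch of that post-processing step, using made-up logit values rather than real model output:

```python
import math

# Hypothetical logits for the six emotion classes (NOT real model output).
LABELS = ["sadness", "joy", "love", "anger", "fear", "surprise"]
logits = [-1.2, 4.8, 0.3, -0.9, -1.5, 0.1]

# Softmax: exponentiate (shifted by the max for numerical stability), then normalize.
m = max(logits)
exps = [math.exp(x - m) for x in logits]
total = sum(exps)
scores = [e / total for e in exps]

# Pair each label with its probability and sort best-first,
# mirroring the shape of the pipeline's top_k=None output.
results = sorted(
    ({"label": label, "score": score} for label, score in zip(LABELS, scores)),
    key=lambda d: d["score"],
    reverse=True,
)
print(results[0]["label"])  # joy
```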
## Intended Use

- Emotion analysis of social media posts and product reviews
- Sentiment and psychological tone detection
- Personal journaling and well-being apps
- Academic research in NLP and affective computing
## Limitations

- Handles English text only
- Predicts a single dominant emotion per text
- May misclassify mixed emotions, sarcasm, or idioms
## Citation

```bibtex
@misc{shivvamm2025emotion,
  title={Emotion Classification using DistilBERT},
  author={Shivvamm},
  year={2025},
  howpublished={\url{https://huggingface.co/shivvamm/emotion-distilbert-finetuned}},
  note={Fine-tuned on the dair-ai/emotion dataset}
}
```
## Author

Shivvamm

- Model: `shivvamm/emotion-distilbert-finetuned`
- License: MIT

Fine-tuned with 🤗 Hugging Face Transformers and Accelerate.