DeBERTa v3 Emotion Classifier

Model description

This repository contains a DeBERTa v3 base model fine-tuned for emotion classification on the dair-ai/emotion dataset. The model is intended for short-text emotion labeling and was trained with standard Hugging Face Trainer-based fine-tuning on Google Colab / Drive.

Use cases

  • Classifying short texts into emotion categories for downstream workflows (analytics, moderation, UX signals).
  • Human-in-the-loop pipelines where low-confidence outputs trigger clarification.

Dataset

  • dair-ai/emotion (public dataset for emotion labeling).
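
For reference, the dataset can be loaded directly with the datasets library. The sketch below is not part of this repository's training code; it simply prints a sample record and the label names the dataset defines.

from datasets import load_dataset

# Load the public emotion dataset (train / validation / test splits).
dataset = load_dataset("dair-ai/emotion")
print(dataset["train"][0])                       # e.g. {"text": "...", "label": 0}
print(dataset["train"].features["label"].names)  # label id -> name mapping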

Training summary

  • Base model: microsoft/deberta-v3-base
  • Fine-tuning method: full fine-tuning (Trainer)
  • Number of labels: 6
  • Training environment: Google Colab
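
The exact training script and hyperparameters are not included in this card. The following is a minimal, illustrative Trainer-based fine-tuning sketch; the learning rate, batch size, and epoch count are placeholder values, not the settings of the original run.

from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

dataset = load_dataset("dair-ai/emotion")
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")

def tokenize(batch):
    # Tokenize the raw text; padding is handled dynamically by the default collator.
    return tokenizer(batch["text"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/deberta-v3-base", num_labels=6)

args = TrainingArguments(
    output_dir="deberta-v3-emotion-classifier",
    learning_rate=2e-5,              # illustrative value
    per_device_train_batch_size=16,  # illustrative value
    num_train_epochs=3,              # illustrative value
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
)
trainer.train()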

Intended use

This model is intended to support emotion classification tasks. It may produce incorrect or biased outputs when used on out-of-distribution text, long-form inputs, or languages it was not trained on. Use a confidence-based fallback for important decisions and include human review for high-stakes applications.
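
As an illustration of the confidence-based fallback mentioned above, the sketch below reuses the tokenizer and model objects from the "How to load" section and defers whenever the top class probability falls below a threshold. The 0.7 threshold is an arbitrary example, and the label lookup assumes the checkpoint's config defines id2label.

import torch
import torch.nn.functional as F

def classify_or_defer(text, threshold=0.7):
    # Tokenize and run a single forward pass without gradients.
    inputs = tokenizer(text, return_tensors="pt", truncation=True).to(model.device)
    with torch.no_grad():
        probs = F.softmax(model(**inputs).logits, dim=-1)[0]
    conf, label_id = probs.max(dim=-1)
    if conf.item() < threshold:
        return None  # defer to a human reviewer / ask a clarifying question
    return model.config.id2label.get(label_id.item(), str(label_id.item()))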

How to load

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

repo_id = "ragunath-ravi/deberta-v3-emotion-classifier"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Move the model to GPU if available and switch to inference mode.
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
model.eval()

Example inference

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
import torch.nn.functional as F

repo_id = "ragunath-ravi/deberta-v3-emotion-classifier"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("I am so happy today!", return_tensors="pt", truncation=True, padding=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to per-class probabilities.
probs = F.softmax(logits, dim=-1).cpu().numpy()[0]
print(probs)
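
To turn the probability vector into a readable label, the top-scoring index can be mapped through the model config's id2label, assuming the checkpoint stores the label names (the dair-ai/emotion label order is sadness, joy, love, anger, fear, surprise).

# Map the highest-scoring class index back to a label name (falls back to the raw id).
pred_id = int(probs.argmax())
print(model.config.id2label.get(pred_id, str(pred_id)), float(probs[pred_id]))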

Acknowledgements

  • This model is based on Microsoft DeBERTa v3 base.
  • Dataset: dair-ai/emotion.
  • Transformers library: Hugging Face transformers.

Model demo: demo
