πŸ€– LoRA-BERT for Sentiment Analysis (SST-2)

This is a lightweight, parameter-efficient BERT model fine-tuned with LoRA (Low-Rank Adaptation) for binary sentiment classification on the SST-2 dataset.


πŸ’‘ Model Highlights

  • βœ… Fine-tuned using LoRA (r=8, Ξ±=16) on top of bert-base-uncased
  • βœ… Trained on the SST-2 dataset
  • βœ… Achieves ~91.17% validation accuracy
  • βœ… Lightweight: only LoRA adapter weights are updated
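
To see just how lightweight this is, here is a back-of-envelope count of the trainable adapter parameters, assuming LoRA with r=8 on the query and value projections of all 12 layers of bert-base-uncased (hidden size 768), as the configuration above states. The classifier head also trains, but the adapter itself is tiny:

```python
# Rough count of trainable LoRA parameters for bert-base-uncased.
# Each adapted 768x768 projection gains two low-rank factors:
# A (r x 768) and B (768 x r).
hidden, layers, r = 768, 12, 8
per_matrix = r * hidden + hidden * r        # params in A plus B
lora_params = layers * 2 * per_matrix       # query + value in every layer
full_params = 110_000_000                   # ~110M params in bert-base
print(lora_params)                          # 294912
print(f"{lora_params / full_params:.2%}")   # well under 1% of the base model
```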

πŸ“Š Results

| Epoch | Training Loss | Validation Loss | Accuracy |
|-------|---------------|-----------------|----------|
| 1     | 0.3030        | 0.2467          | 89.91%   |
| 2     | 0.1972        | 0.2424          | 90.94%   |
| 3     | 0.2083        | 0.2395          | 91.17%   |
| 4     | 0.1936        | 0.2464          | 90.94%   |
| 5     | 0.1914        | 0.2491          | 90.83%   |

Validation loss and accuracy both peak at Epoch 3, so early stopping there would avoid the mild overfitting visible in Epochs 4–5.
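
The early-stopping decision can be sketched with a simple patience rule (this is an illustration, not the actual training script; in Transformers you would use `EarlyStoppingCallback`). Applied to the validation losses from the table, patience=1 halts training right after Epoch 3's best checkpoint:

```python
def early_stop_epoch(val_losses, patience=1):
    """Return the 1-indexed epoch at which training halts,
    or None if the patience budget is never exhausted."""
    best = float("inf")
    since_best = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, since_best = loss, 0
        else:
            since_best += 1
            if since_best >= patience:
                return epoch
    return None

# Validation losses from the table above
losses = [0.2467, 0.2424, 0.2395, 0.2464, 0.2491]
print(early_stop_epoch(losses, patience=1))  # 4: stops one epoch after the best (epoch 3)
```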


πŸ› οΈ Usage

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftConfig, PeftModel

model_id = "Harsh-Gupta/bert-lora-sentiment"

# Load the PEFT config, the frozen base model, and the LoRA adapter
config = PeftConfig.from_pretrained(model_id)
base_model = AutoModelForSequenceClassification.from_pretrained(config.base_model_name_or_path)
model = PeftModel.from_pretrained(base_model, model_id)
model.eval()

# Tokenizer comes from the base model
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

# Predict
text = "This movie was absolutely amazing!"
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)
with torch.no_grad():
    outputs = model(**inputs)
    probs = outputs.logits.softmax(dim=-1)
    pred = probs.argmax(dim=-1).item()
```
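
To turn the predicted index into a label: SST-2 conventionally uses 0 = negative and 1 = positive, but it is worth confirming against `model.config.id2label` for this checkpoint. A dependency-free sketch of the softmax-and-lookup step, using hypothetical logits:

```python
import math

# Hypothetical logits like those returned by the model above;
# label order 0 = negative, 1 = positive is the usual SST-2 convention.
labels = ["negative", "positive"]
logits = [-1.3, 2.7]

exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]
print(labels[probs.index(max(probs))])  # positive
```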

βš™οΈ LoRA Configuration

```python
from peft import LoraConfig

LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["query", "value"],
    lora_dropout=0.1,
    bias="none",
    task_type="SEQ_CLS",
)
```
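
Mathematically, LoRA leaves the pretrained weight W frozen and learns a rank-r update, so the effective weight is W + (Ξ±/r)·BA. A minimal NumPy sketch of this standard formulation, using the r=8, Ξ±=16 values from the highlights (toy random data, not the actual model weights):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 768, 8, 16                 # hidden size, rank, scaling

W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01   # trainable, small random init
B = np.zeros((d, r))                     # trainable, zero init

# Forward pass: the adapter adds a rank-r update scaled by alpha/r.
x = rng.standard_normal(d)
y = W @ x + (alpha / r) * (B @ (A @ x))

# With B = 0 the adapter is a no-op at initialization,
# so fine-tuning starts exactly from the pretrained model.
print(np.allclose(y, W @ x))             # True
```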

πŸ” Intended Use

  • Sentiment classification for binary text (positive/negative)

  • Can be adapted to other domains: movie reviews, product reviews, tweets


🧠 Author

  • Harsh Gupta
  • MCA, Jawaharlal Nehru University (JNU)
  • GitHub: 2003Harsh