Fake Review Detection Model

Model Description

DistilBERT model fine-tuned to detect computer-generated product reviews.

Performance

Metric      Real (OR)   Fake (CG)
Precision   0.99        0.97
Recall      0.97        0.99
F1-Score    0.98        0.98
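As a sanity check, the F1 scores above follow directly from the precision and recall values, since F1 is the harmonic mean of the two. A minimal sketch:

```python
# F1 is the harmonic mean of precision and recall.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

# Values taken from the table above.
print(round(f1(0.99, 0.97), 2))  # Real class -> 0.98
print(round(f1(0.97, 0.99), 2))  # Fake class -> 0.98
```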

How to Use

from transformers import pipeline

# Load model (replace with your actual username)
classifier = pipeline(
    "text-classification",
    model="debojit01/fake-review-detector"
)

# Example inference
result = classifier("This product is absolutely perfect!")
print(result)  # Output: [{'label': 'REAL', 'score': 0.99}] (the pipeline returns a list of dicts)
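In practice you may want to turn the raw scores into a flag rather than inspect them by hand. Below is a hedged sketch of one way to post-process the pipeline's list-of-dicts output; the `REAL`/`FAKE` label names match the example output above, and the 0.9 confidence threshold is purely illustrative, not part of the model:

```python
# Illustrative post-processing of pipeline output.
# Assumes each result is a dict like {'label': 'REAL', 'score': 0.99};
# the 0.9 threshold is an arbitrary choice for this sketch.
def flag_fake(results, threshold=0.9):
    return [r["label"] == "FAKE" and r["score"] >= threshold for r in results]

# Mock output in the pipeline's list-of-dicts format:
sample = [
    {"label": "REAL", "score": 0.99},
    {"label": "FAKE", "score": 0.95},
    {"label": "FAKE", "score": 0.60},  # low confidence, not flagged
]
print(flag_fake(sample))  # [False, True, False]
```

Tuning the threshold trades recall for precision: a higher value flags fewer reviews but with greater confidence.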

Training Data

  • 20,000 real product reviews (OR)
  • 40,000 computer-generated reviews (CG)
  • 50/50 train-test split
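The 50/50 train-test split described above can be sketched as follows. This is an illustration only; the `(text, label)` pairs are placeholders, not the actual review data, and the seed is arbitrary:

```python
import random

# Placeholder labeled reviews standing in for the real OR/CG dataset.
reviews = [(f"review {i}", "OR" if i % 2 == 0 else "CG") for i in range(100)]

random.seed(42)          # fixed seed so the split is reproducible
random.shuffle(reviews)  # shuffle before splitting to mix classes

# 50/50 split: first half for training, second half for testing.
half = len(reviews) // 2
train, test = reviews[:half], reviews[half:]
print(len(train), len(test))  # 50 50
```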

Ethical Considerations

Use responsibly. May reflect biases present in training data.

Model size: 67M parameters (Safetensors, F32 tensors)