# Thai-FakeNews-BERT
A Thai-language BERT model fine-tuned for fake news detection. This model is part of a Senior Project by CPE35 students from King Mongkut's University of Technology Thonburi (KMUTT).
## Model Description

- Base model: monsoon-nlp/bert-base-thai
- Dataset: EXt1/Thai-True-Fake-News
- Model size: 105M parameters
- Task: text classification
- Language: Thai
- Labels:
  - 0: True News
  - 1: Fake News
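The label mapping above can be kept as a plain dictionary when post-processing predictions. A minimal sketch (these constants mirror this card's label list; they are not necessarily the `id2label` entries stored in the model's config):

```python
# Label mapping as documented on this card (assumed, not read from the model config)
ID2LABEL = {0: "True News", 1: "Fake News"}
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}

print(ID2LABEL[1])  # Fake News
```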
## Evaluation Results

- Loss: 0.3976
- Accuracy: 85% on the test set
## Usage
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("EXt1/ThaiFakeNews-BERT")
model = AutoModelForSequenceClassification.from_pretrained("EXt1/ThaiFakeNews-BERT")

# Example headline: "Prepare for summer storms hitting on 26-28 Apr 2025"
text = "เตรียมรับมือ พายุฤดูร้อนพัดถล่ม 26-28 เม.ย. 68"
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = torch.argmax(logits, dim=1).item()
if predicted_class == 1:
    print("ข่าวปลอม")  # "Fake news"
else:
    print("ข่าวจริง")  # "True news"
```
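To report a confidence score alongside the predicted label, apply a softmax to the logits. A minimal sketch (the `logits` tensor here is a dummy stand-in for `model(**inputs).logits` so the snippet runs without downloading the model):

```python
import torch
import torch.nn.functional as F

# Dummy logits standing in for model(**inputs).logits (batch of 1, 2 classes)
logits = torch.tensor([[0.2, 1.4]])

probs = F.softmax(logits, dim=-1)                    # class probabilities
predicted_class = int(probs.argmax(dim=-1))          # 0 = true news, 1 = fake news
confidence = float(probs[0, predicted_class])

label = "Fake News" if predicted_class == 1 else "True News"
print(f"{label} (confidence {confidence:.2f})")
```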