# HammaLoRACAMeLBert

Advanced Arabic Dialect Classification Model with Complete Training Metrics

## Full Training History
| epoch | train_loss | eval_loss | train_accuracy | eval_accuracy | f1 | precision | recall |
|---|---|---|---|---|---|---|---|
| 1 | 1.37716 | 1.3692 | 0.53022 | 0.531461 | 0.473072 | 0.541513 | 0.53022 |
| 2 | 1.07614 | 1.06994 | 0.624063 | 0.636517 | 0.613924 | 0.630747 | 0.624063 |
| 3 | 0.968308 | 0.956658 | 0.662962 | 0.675281 | 0.661112 | 0.674352 | 0.662962 |
| 4 | 0.908012 | 0.900243 | 0.680569 | 0.692697 | 0.67889 | 0.688134 | 0.680569 |
| 5 | 0.858731 | 0.850955 | 0.699051 | 0.711236 | 0.70055 | 0.711083 | 0.699051 |
| 6 | 0.83265 | 0.825597 | 0.708417 | 0.716854 | 0.709791 | 0.724241 | 0.708417 |
| 7 | 0.796107 | 0.792507 | 0.722777 | 0.727528 | 0.723871 | 0.733254 | 0.722777 |
| 8 | 0.774045 | 0.773437 | 0.729645 | 0.732584 | 0.732394 | 0.747566 | 0.729645 |
| 9 | 0.758434 | 0.762295 | 0.736763 | 0.737079 | 0.739592 | 0.753967 | 0.736763 |
| 10 | 0.74339 | 0.7477 | 0.743194 | 0.744382 | 0.74493 | 0.757555 | 0.743194 |
| 11 | 0.730814 | 0.737005 | 0.74975 | 0.748876 | 0.751424 | 0.760475 | 0.74975 |
| 12 | 0.731264 | 0.74201 | 0.747378 | 0.748315 | 0.749058 | 0.760792 | 0.747378 |
| 13 | 0.719465 | 0.729863 | 0.753184 | 0.750562 | 0.754813 | 0.765065 | 0.753184 |
| 14 | 0.71225 | 0.722574 | 0.756868 | 0.754494 | 0.758437 | 0.767501 | 0.756868 |
| 15 | 0.712998 | 0.724446 | 0.756244 | 0.755618 | 0.757984 | 0.767426 | 0.756244 |
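Note that the recall column matches train_accuracy at every epoch. That is exactly what happens when recall is averaged per class with support weights, so the table's f1/precision/recall are most likely support-weighted averages. A minimal sketch of why weighted recall reduces to accuracy (toy labels, not the actual eval set):

```python
from collections import Counter

def weighted_recall(y_true, y_pred):
    """Per-class recall averaged by class frequency (support weights).

    With support weights, each class contributes (n_c / N) * (hits_c / n_c),
    so the sum telescopes to hits / N -- i.e. plain accuracy.
    """
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for cls, n in support.items():
        hits = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        score += (n / total) * (hits / n)
    return score

# Hypothetical toy data for illustration only
y_true = [0, 0, 1, 2, 2, 2]
y_pred = [0, 1, 1, 2, 2, 0]
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(weighted_recall(y_true, y_pred), accuracy)  # both 0.666...
```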
## Label Mapping

```python
{0: 'Egypt', 1: 'Iraq', 2: 'Lebanon', 3: 'Morocco', 4: 'Saudi_Arabia', 5: 'Sudan', 6: 'Tunisia'}
```
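This mapping turns the model's integer class index into a dialect name; its inverse is useful when you need the index for a given dialect. A small sketch (the `predicted_index` value is hypothetical):

```python
# The card's label mapping, as it would appear in the model config (id2label)
id2label = {0: 'Egypt', 1: 'Iraq', 2: 'Lebanon', 3: 'Morocco',
            4: 'Saudi_Arabia', 5: 'Sudan', 6: 'Tunisia'}

# Inverse mapping: dialect name -> class index
label2id = {name: idx for idx, name in id2label.items()}

predicted_index = 1  # hypothetical argmax over the 7 logits
print(id2label[predicted_index])  # Iraq
print(label2id['Tunisia'])        # 6
```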
## Usage Example

```python
import torch
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Hamma-16/HammaLoRACAMeLBert",
    device=0 if torch.cuda.is_available() else -1,
)

sample_text = "شلونك اليوم؟"  # "How are you today?"
result = classifier(sample_text)
print(f"Text: {sample_text}")
print(f"Predicted: {result[0]['label']} (confidence: {result[0]['score']:.1%})")
```
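By default the pipeline returns only the top label; passing `top_k=None` makes it return a score for every dialect. The post-processing below works on that output shape, but the scores here are invented for illustration (running the real model requires downloading it):

```python
# Hypothetical output of classifier(sample_text, top_k=None):
# one {"label", "score"} dict per dialect. Scores are made up for this sketch.
all_scores = [
    {"label": "Iraq", "score": 0.62},
    {"label": "Saudi_Arabia", "score": 0.18},
    {"label": "Egypt", "score": 0.09},
    {"label": "Lebanon", "score": 0.05},
    {"label": "Sudan", "score": 0.03},
    {"label": "Morocco", "score": 0.02},
    {"label": "Tunisia", "score": 0.01},
]

# Rank by confidence and show the three most likely dialects
ranked = sorted(all_scores, key=lambda d: d["score"], reverse=True)
for entry in ranked[:3]:
    print(f"{entry['label']}: {entry['score']:.1%}")
```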
## Model tree for Hamma-16/HammaLoRACAMeLBERT-v1

Base model: CAMeL-Lab/bert-base-arabic-camelbert-mix