# HammaLoRAMarBert

Arabic dialect classification model (a LoRA fine-tune of MARBERT) with complete training metrics.

## Full Training History
epoch | train_loss | eval_loss | train_accuracy | eval_accuracy | f1 | precision | recall |
---|---|---|---|---|---|---|---|
1 | 1.23777 | 1.22883 | 0.696741 | 0.706742 | 0.679064 | 0.704547 | 0.696741 |
2 | 0.740169 | 0.733708 | 0.790397 | 0.800562 | 0.791608 | 0.797789 | 0.790397 |
3 | 0.601572 | 0.617834 | 0.818182 | 0.821348 | 0.819564 | 0.824729 | 0.818182 |
4 | 0.562756 | 0.585901 | 0.824363 | 0.816292 | 0.825464 | 0.835567 | 0.824363 |
5 | 0.497183 | 0.534541 | 0.839411 | 0.832022 | 0.839956 | 0.842141 | 0.839411 |
6 | 0.467484 | 0.529349 | 0.848964 | 0.830899 | 0.850348 | 0.855113 | 0.848964 |
7 | 0.447877 | 0.52692 | 0.851773 | 0.832022 | 0.852826 | 0.857268 | 0.851773 |
8 | 0.44038 | 0.525875 | 0.854021 | 0.830337 | 0.855092 | 0.860913 | 0.854021 |
9 | 0.416875 | 0.513681 | 0.863886 | 0.835955 | 0.865207 | 0.870201 | 0.863886 |
10 | 0.397198 | 0.498091 | 0.868506 | 0.839888 | 0.869502 | 0.872867 | 0.868506 |
11 | 0.396181 | 0.509205 | 0.86757 | 0.835955 | 0.869238 | 0.875968 | 0.86757 |
12 | 0.38368 | 0.494237 | 0.873064 | 0.838764 | 0.87361 | 0.875448 | 0.873064 |
13 | 0.377543 | 0.496908 | 0.874001 | 0.83764 | 0.874749 | 0.877947 | 0.874001 |
14 | 0.371016 | 0.491708 | 0.877435 | 0.841573 | 0.878057 | 0.880101 | 0.877435 |
15 | 0.370049 | 0.493832 | 0.877872 | 0.840449 | 0.878651 | 0.881198 | 0.877872 |
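The card does not state how f1, precision, and recall were averaged, but the recall column matching train_accuracy exactly is consistent with weighted averaging (weighted recall reduces to accuracy). Below is a minimal sketch of a `compute_metrics` function for the Hugging Face `Trainer` that would produce these columns under that assumption; it is illustrative, not the author's original training code.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred carries model logits and gold label ids
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "weighted" averaging is an assumption; the card does not specify it
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```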
## Label Mapping

```python
{0: 'Egypt', 1: 'Iraq', 2: 'Lebanon', 3: 'Morocco', 4: 'Saudi_Arabia', 5: 'Sudan', 6: 'Tunisia'}
```
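If the hosted config already stores this mapping as `id2label`, the pipeline returns the dialect names directly; otherwise it emits generic `LABEL_<i>` strings. The small helper below (hypothetical, not part of the repository) normalizes either case.

```python
id2label = {
    0: "Egypt", 1: "Iraq", 2: "Lebanon", 3: "Morocco",
    4: "Saudi_Arabia", 5: "Sudan", 6: "Tunisia",
}

def to_dialect(label: str) -> str:
    """Convert 'LABEL_3' -> 'Morocco'; pass through if already a dialect name."""
    if label.startswith("LABEL_"):
        return id2label[int(label.split("_")[-1])]
    return label
```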
## Usage Example

```python
import torch
from transformers import pipeline

# Load the fine-tuned dialect classifier from the Hub
classifier = pipeline(
    "text-classification",
    model="Hamma-16/HammaLoRAMarBert",
    device="cuda" if torch.cuda.is_available() else "cpu",
)

sample_text = "شلونك اليوم؟"  # "How are you today?"
result = classifier(sample_text)
print(f"Text: {sample_text}")
print(f"Predicted: {result[0]['label']} (confidence: {result[0]['score']:.1%})")
```
## Model Tree for Hamma-16/HammaLoRAMarBERT-v1

Base model: UBC-NLP/MARBERT
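The usage example above loads the repository directly, which suggests a merged checkpoint. If the repo instead ships a LoRA adapter on top of the base model (an assumption based on the name), it could be attached with the `peft` library roughly as follows:

```python
# Hedged sketch, assuming the repo holds a LoRA adapter for UBC-NLP/MARBERT
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

base = AutoModelForSequenceClassification.from_pretrained("UBC-NLP/MARBERT", num_labels=7)
model = PeftModel.from_pretrained(base, "Hamma-16/HammaLoRAMarBert")
tokenizer = AutoTokenizer.from_pretrained("UBC-NLP/MARBERT")
```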