# CPALL-Stock-Trend-Prediction-category-sentiment-filter-1stphase-Wangchanberta-APR-2
This model is a fine-tuned version of [airesearch/wangchanberta-base-att-spm-uncased](https://huggingface.co/airesearch/wangchanberta-base-att-spm-uncased) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.4368
- Accuracy: 0.9274
- Precision: 0.9315
- Recall: 0.9274
- F1: 0.9280
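The evaluation code is not included in this card, but the pattern in the numbers (recall equal to accuracy, precision slightly higher) is characteristic of weighted-average metrics. As a minimal pure-Python sketch of how such weighted precision/recall/F1 values are typically computed (equivalent to scikit-learn's `average='weighted'`):

```python
from collections import Counter

def weighted_prf(y_true, y_pred):
    """Weighted-average precision, recall, and F1 across classes.

    Each per-class metric is weighted by that class's share of the
    true labels; weighted recall is then identical to accuracy.
    Toy illustration only; the model's actual evaluation script is
    not published in this card.
    """
    support = Counter(y_true)
    n = len(y_true)
    precision = recall = f1 = 0.0
    for c in sorted(support):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        pred_c = sum(1 for p in y_pred if p == c)
        p = tp / pred_c if pred_c else 0.0   # per-class precision
        r = tp / support[c]                  # per-class recall
        f = 2 * p * r / (p + r) if (p + r) else 0.0
        w = support[c] / n                   # class weight = support share
        precision += w * p
        recall += w * r
        f1 += w * f
    return precision, recall, f1
```

Because of the support weighting, weighted recall always coincides with plain accuracy, which matches the identical Accuracy and Recall columns reported above.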
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
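With a linear scheduler and no warmup steps listed, the learning rate decays from 2e-05 to 0 over the 2,870 total optimizer steps (287 steps per epoch × 10 epochs, per the training log below). A sketch of that schedule, assuming `num_warmup_steps=0` as in transformers' `get_linear_schedule_with_warmup`:

```python
LEARNING_RATE = 2e-5
STEPS_PER_EPOCH = 287   # from the training log
NUM_EPOCHS = 10
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS  # 2870

def linear_lr(step, total_steps=TOTAL_STEPS, base_lr=LEARNING_RATE):
    """Learning rate at a given optimizer step under linear decay
    with zero warmup: base_lr scaled by the fraction of steps left."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

So the learning rate is 2e-05 at step 0, halves to 1e-05 at the midpoint (step 1435), and reaches 0 at step 2870.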
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 287 | 0.5042 | 0.7994 | 0.8040 | 0.7994 | 0.8004 |
| 0.617 | 2.0 | 574 | 0.4759 | 0.8292 | 0.8606 | 0.8292 | 0.8314 |
| 0.617 | 3.0 | 861 | 0.3353 | 0.9055 | 0.9095 | 0.9055 | 0.9062 |
| 0.2406 | 4.0 | 1148 | 0.3970 | 0.9088 | 0.9151 | 0.9088 | 0.9097 |
| 0.2406 | 5.0 | 1435 | 0.3643 | 0.9205 | 0.9256 | 0.9205 | 0.9212 |
| 0.1157 | 6.0 | 1722 | 0.4217 | 0.9184 | 0.9240 | 0.9184 | 0.9190 |
| 0.0595 | 7.0 | 2009 | 0.4343 | 0.9205 | 0.9269 | 0.9205 | 0.9212 |
| 0.0595 | 8.0 | 2296 | 0.4867 | 0.9210 | 0.9276 | 0.9210 | 0.9218 |
| 0.0317 | 9.0 | 2583 | 0.4009 | 0.9290 | 0.9321 | 0.9290 | 0.9295 |
| 0.0317 | 10.0 | 2870 | 0.4368 | 0.9274 | 0.9315 | 0.9274 | 0.9280 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1