Neuria_BERT_Relacionados

This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-cased on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0564
  • Accuracy: 0.9817
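
The checkpoint can be loaded like any sequence-classification model from the Hub. Below is a minimal usage sketch, assuming the model is published as neuria99/Neuria_BERT_Relacionados and carries a classification head; the example text and the meaning of the output labels are assumptions, since the task is not documented in this card.

```python
from transformers import pipeline

# Hypothetical usage sketch: assumes the checkpoint is available on the Hub
# and exposes a sequence-classification head.
clf = pipeline(
    "text-classification",
    model="neuria99/Neuria_BERT_Relacionados",
)

# Example Spanish input; the label set (e.g. related / not related) is an
# assumption and is not specified by this model card.
print(clf("Ejemplo de texto en español para clasificar."))
```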

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 64
  • total_train_batch_size: 4096
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 50
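
The following is a configuration sketch that mirrors the hyperparameters above using the transformers Trainer API. The output directory, number of labels, and the dataset/tokenization steps are assumptions; only the listed hyperparameter values come from this card.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "dccuchile/bert-base-spanish-wwm-cased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=2 is an assumption; the card does not document the label set.
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)

# Mirrors the hyperparameters listed above; the effective batch size is
# 64 (per device) * 64 (gradient accumulation) = 4096.
args = TrainingArguments(
    output_dir="Neuria_BERT_Relacionados",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=64,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    eval_strategy="epoch",
)

# train_dataset / eval_dataset are placeholders for the undocumented data:
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```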

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.373         | 1.0   | 1    | 0.6457          | 0.7232   |
| 0.3475        | 2.0   | 2    | 0.6094          | 0.8538   |
| 0.3267        | 3.0   | 3    | 0.5733          | 0.8590   |
| 0.3079        | 4.0   | 4    | 0.5341          | 0.8851   |
| 0.2874        | 5.0   | 5    | 0.4910          | 0.9164   |
| 0.266         | 6.0   | 6    | 0.4456          | 0.9478   |
| 0.2446        | 7.0   | 7    | 0.3995          | 0.9661   |
| 0.2211        | 8.0   | 8    | 0.3539          | 0.9661   |
| 0.1988        | 9.0   | 9    | 0.3109          | 0.9687   |
| 0.1777        | 10.0  | 10   | 0.2716          | 0.9687   |
| 0.1552        | 11.0  | 11   | 0.2371          | 0.9713   |
| 0.1361        | 12.0  | 12   | 0.2084          | 0.9687   |
| 0.1209        | 13.0  | 13   | 0.1847          | 0.9661   |
| 0.1057        | 14.0  | 14   | 0.1648          | 0.9687   |
| 0.0921        | 15.0  | 15   | 0.1480          | 0.9687   |
| 0.0817        | 16.0  | 16   | 0.1335          | 0.9687   |
| 0.0717        | 17.0  | 17   | 0.1208          | 0.9713   |
| 0.0636        | 18.0  | 18   | 0.1100          | 0.9713   |
| 0.055         | 19.0  | 19   | 0.1013          | 0.9739   |
| 0.0514        | 20.0  | 20   | 0.0943          | 0.9739   |
| 0.0445        | 21.0  | 21   | 0.0888          | 0.9739   |
| 0.042         | 22.0  | 22   | 0.0845          | 0.9765   |
| 0.0359        | 23.0  | 23   | 0.0812          | 0.9765   |
| 0.0343        | 24.0  | 24   | 0.0787          | 0.9739   |
| 0.0301        | 25.0  | 25   | 0.0763          | 0.9765   |
| 0.0278        | 26.0  | 26   | 0.0739          | 0.9765   |
| 0.0258        | 27.0  | 27   | 0.0717          | 0.9765   |
| 0.0242        | 28.0  | 28   | 0.0698          | 0.9765   |
| 0.0233        | 29.0  | 29   | 0.0681          | 0.9765   |
| 0.0211        | 30.0  | 30   | 0.0665          | 0.9765   |
| 0.0199        | 31.0  | 31   | 0.0647          | 0.9791   |
| 0.0181        | 32.0  | 32   | 0.0631          | 0.9791   |
| 0.0176        | 33.0  | 33   | 0.0615          | 0.9791   |
| 0.0163        | 34.0  | 34   | 0.0603          | 0.9791   |
| 0.0157        | 35.0  | 35   | 0.0594          | 0.9817   |
| 0.0148        | 36.0  | 36   | 0.0589          | 0.9817   |
| 0.0135        | 37.0  | 37   | 0.0584          | 0.9817   |
| 0.0138        | 38.0  | 38   | 0.0579          | 0.9817   |
| 0.0124        | 39.0  | 39   | 0.0577          | 0.9817   |
| 0.0126        | 40.0  | 40   | 0.0574          | 0.9817   |
| 0.0128        | 41.0  | 41   | 0.0571          | 0.9817   |
| 0.0122        | 42.0  | 42   | 0.0569          | 0.9817   |
| 0.0118        | 43.0  | 43   | 0.0568          | 0.9817   |
| 0.0121        | 44.0  | 44   | 0.0565          | 0.9817   |
| 0.0116        | 45.0  | 45   | 0.0566          | 0.9791   |
| 0.012         | 46.0  | 46   | 0.0564          | 0.9817   |
| 0.0114        | 47.0  | 47   | 0.0564          | 0.9791   |
| 0.0104        | 48.0  | 48   | 0.0564          | 0.9817   |
| 0.0108        | 49.0  | 49   | 0.0564          | 0.9817   |
| 0.0104        | 50.0  | 50   | 0.0564          | 0.9817   |

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.4.1
  • Datasets 2.19.1
  • Tokenizers 0.21.0