TraductorEnEs
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-es on an unknown dataset.
Model description
This is an AI-generated model (an AI was used to generate this AI model).
Intended uses & limitations
This is an educational attempt, so you may use it in whatever way the license permits.
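For example, the fine-tuned model can be loaded with the transformers translation pipeline. The sketch below assumes the weights are published on the Hub as Harolin/TraductorEnEs; adjust the identifier to wherever the model is stored.

```python
from transformers import pipeline

# Load the fine-tuned English -> Spanish translator from the Hub
# (repository id assumed to be Harolin/TraductorEnEs).
translator = pipeline("translation", model="Harolin/TraductorEnEs")

result = translator("Machine learning is changing the world.")
print(result[0]["translation_text"])
```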
Training and evaluation data
Training used 50,000 sentence pairs (English sentences and their Spanish translations).
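The exact corpus is not specified in this card. As an illustration only, the sketch below prepares 50,000 parallel sentences from the public opus_books en-es split (a hypothetical stand-in for the actual data) and tokenizes them for seq2seq fine-tuning.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Hypothetical stand-in corpus; the dataset actually used is unknown.
raw = load_dataset("opus_books", "en-es", split="train").select(range(50_000))

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-es")

def preprocess(batch):
    # Tokenize English sources and Spanish targets for the seq2seq model.
    sources = [pair["en"] for pair in batch["translation"]]
    targets = [pair["es"] for pair in batch["translation"]]
    return tokenizer(sources, text_target=targets, truncation=True, max_length=128)

tokenized_train = raw.map(preprocess, batched=True, remove_columns=raw.column_names)
```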
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
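A minimal training sketch that plugs these hyperparameters into Seq2SeqTrainingArguments is shown below. It is not the original training script; it reuses the tokenizer and tokenized_train from the data-preparation sketch above, and the output directory name is an assumption.

```python
from transformers import (
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-es")

args = Seq2SeqTrainingArguments(
    output_dir="TraductorEnEs",      # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,                       # native AMP mixed precision
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized_train,   # tokenized pairs from the sketch above
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```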
Training results
The Spanish translations produced by the fine-tuned model are surprisingly accurate; we did not expect it to work this well.
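This assessment is qualitative; the card reports no metric. If a number is needed, translation quality could be measured with sacreBLEU via the evaluate library, as in the sketch below (the sentence pairs are hypothetical placeholders, not an actual evaluation set).

```python
import evaluate
from transformers import pipeline

bleu = evaluate.load("sacrebleu")
translator = pipeline("translation", model="Harolin/TraductorEnEs")

# Hypothetical held-out pairs; replace with a real evaluation split.
sources = ["The weather is nice today.", "She reads a book every week."]
references = [["Hoy hace buen tiempo."], ["Ella lee un libro cada semana."]]

predictions = [out["translation_text"] for out in translator(sources)]
print(bleu.compute(predictions=predictions, references=references))
```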
Framework versions
- Transformers 4.52.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1