---
library_name: transformers
language:
  - nl
license: mit
base_model: microsoft/speecht5_tts
tags:
  - generated_from_trainer
datasets:
  - maguette
model-index:
  - name: SpeechT5 TTS MT V0
    results: []
---

# SpeechT5 TTS MT V0

This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the maguette dataset. It achieves the following results on the evaluation set:

- Loss: 1.2207
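
As this card does not yet include usage instructions, the following is a minimal inference sketch using the standard `transformers` SpeechT5 API. The repository id, the example sentence, and the zero speaker embedding are illustrative assumptions rather than values taken from this card; SpeechT5 conditions generation on a 512-dimensional x-vector speaker embedding, and substituting the embedding used during fine-tuning will give far better output.

```python
import torch
import soundfile as sf
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

repo_id = "maguette9315/speecht5-tts-mt-v0"  # hypothetical Hub id; replace with the real one

processor = SpeechT5Processor.from_pretrained(repo_id)
model = SpeechT5ForTextToSpeech.from_pretrained(repo_id)
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# Dutch example text, matching the card's `nl` language tag.
inputs = processor(text="Hallo, dit is een test.", return_tensors="pt")

# SpeechT5 expects a (1, 512) x-vector speaker embedding; a zero vector is
# only a runnable placeholder and will not sound like the fine-tuned voice.
speaker_embeddings = torch.zeros((1, 512))

with torch.no_grad():
    speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)

sf.write("speech.wav", speech.numpy(), samplerate=16000)  # SpeechT5 outputs 16 kHz audio
```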

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- training_steps: 100
- mixed_precision_training: Native AMP
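
For orientation, here is a hedged sketch of how these values map onto `transformers`' `Seq2SeqTrainingArguments`. The `output_dir` and the evaluation/logging cadence are assumptions (the cadence is inferred from the results table below, where validation runs every 10 steps and training loss appears to be logged every 25 steps); everything else mirrors the list above.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5-tts-mt-v0",  # hypothetical; not stated on this card
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,    # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=10,
    max_steps=100,
    fp16=True,                        # "Native AMP" mixed-precision training
    eval_strategy="steps",            # assumption, inferred from the results table
    eval_steps=10,
    logging_steps=25,                 # assumption; matches the "No log" rows below
)
```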

### Training results

| Training Loss | Epoch   | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| No log        | 6.6667  | 10   | 1.9046          |
| No log        | 13.3333 | 20   | 1.5552          |
| 1.9023        | 20.0    | 30   | 1.4780          |
| 1.9023        | 26.6667 | 40   | 1.4256          |
| 1.567         | 33.3333 | 50   | 1.4126          |
| 1.567         | 40.0    | 60   | 1.3005          |
| 1.567         | 46.6667 | 70   | 1.2734          |
| 1.4002        | 53.3333 | 80   | 1.2259          |
| 1.4002        | 60.0    | 90   | 1.1843          |
| 1.2804        | 66.6667 | 100  | 1.2207          |

### Framework versions

- Transformers 4.46.3
- Pytorch 2.6.0+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3