---
tags:
- generated_from_trainer
datasets:
- danish_legal_pile
metrics:
- accuracy
model-index:
- name: danish-legal-longformer-base-mlm
  results:
  - task:
      name: Masked Language Modeling
      type: fill-mask
    dataset:
      name: danish_legal_pile
      type: danish_legal_pile
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8285689003181987
---

# danish-legal-longformer-base-mlm

This model is a fine-tuned version of [data/plms/danish-legal-longformer-base](https://huggingface.co/data/plms/danish-legal-longformer-base) on the danish_legal_pile dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7374
- Accuracy: 0.8286

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- training_steps: 64000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.7425        | 14.69 | 32000 | 0.7502          | 0.8259   |
| 0.7257        | 29.37 | 64000 | 0.7368          | 0.8287   |

### Framework versions

- Transformers 4.18.0
- Pytorch 1.12.0+cu113
- Datasets 2.0.0
- Tokenizers 0.12.1
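
## Reproducing the training hyperparameters (sketch)

The card does not include the training script. As a rough guide only, the hyperparameters listed above map onto `transformers.TrainingArguments` as sketched below; the output directory is a placeholder, and the Adam betas/epsilon correspond to the library defaults.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameters listed in this card.
# Not the original training configuration; output_dir is an assumed placeholder.
training_args = TrainingArguments(
    output_dir="danish-legal-longformer-base-mlm",  # placeholder
    learning_rate=3e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size 64
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    max_steps=64000,
    fp16=True,  # "Native AMP" mixed precision
)
```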
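
## How to use (sketch)

Since usage is not yet documented above, here is a minimal fill-mask sketch using the `transformers` pipeline. The repository ID below is an assumption based on the model name in this card; substitute the actual path where the checkpoint is published.

```python
from transformers import pipeline

# Assumed repository ID -- replace with the real location of this checkpoint.
MODEL_ID = "danish-legal-longformer-base-mlm"

# Standard fill-mask pipeline for a masked language model.
fill_mask = pipeline("fill-mask", model=MODEL_ID)

# Longformer uses the RoBERTa-style "<mask>" token.
# Illustrative Danish legal-style sentence (not from the training data).
for prediction in fill_mask("Retten finder, at sagsøgte er <mask> til at betale erstatning."):
    print(prediction["token_str"], prediction["score"])
```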