---
library_name: transformers
tags: []
---
# Model Card for mT5-large-HuAMR
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This model is a fine-tuned version of google/mt5-large for parsing Hungarian sentences into Abstract Meaning Representation (AMR) graphs.
- **Model type:** Abstract Meaning Representation parser
- **Language(s) (NLP):** Hungarian
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/botondbarta/HuAMR/tree/master/huamr
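## How to Get Started with the Model
The snippet below is a minimal usage sketch with the `transformers` library. It assumes the checkpoint is published on the Hub as `BotondBarta/mT5-large-HuAMR` and that the model takes a raw Hungarian sentence as input and generates a linearized AMR graph; adjust the model identifier and generation settings to match the released checkpoint.
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "BotondBarta/mT5-large-HuAMR"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hungarian input sentence: "The boy is reading a book."
sentence = "A fiú könyvet olvas."
inputs = tokenizer(sentence, return_tensors="pt")

# Generate the linearized AMR graph with beam search.
outputs = model.generate(**inputs, max_new_tokens=256, num_beams=4)
amr_graph = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(amr_graph)
```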
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32 -->
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
## Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]