---
language:
- "tr"
tags:
- "turkish"
- "pos"
- "dependency-parsing"
base_model: HPLT/hplt_bert_base_tr
datasets:
- "universal_dependencies"
license: "apache-2.0"
pipeline_tag: "token-classification"
---

# ltgbert-base-turkish-ud-goeswith

## Model Description

This is an LTG-BERT model for POS-tagging and dependency-parsing (using `goeswith` for subwords), derived from [hplt_bert_base_tr](https://huggingface.co/HPLT/hplt_bert_base_tr).

## How to Use

```py
from transformers import pipeline

# Load the custom "universal-dependencies" pipeline bundled with the model repository;
# trust_remote_code=True is required because the pipeline class is defined there.
nlp = pipeline(
    "universal-dependencies",
    "KoichiYasuoka/ltgbert-base-turkish-ud-goeswith",
    trust_remote_code=True,
    aggregation_strategy="simple",
)

# "Ay dağın diğer tarafında yükseldi" = "The moon rose on the other side of the mountain"
print(nlp("Ay dağın diğer tarafında yükseldi"))
```
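
If the pipeline output is CoNLL-U formatted text, as is typical for the `ud-goeswith` models, you can work with it programmatically instead of just printing it. The snippet below is a minimal sketch under that assumption, using the third-party `conllu` package (not required by the model itself).

```py
import conllu  # third-party package: pip install conllu

# Parse the CoNLL-U string returned by the pipeline (assumed output format).
result = nlp("Ay dağın diğer tarafında yükseldi")
for sentence in conllu.parse(result):
    for token in sentence:
        # Each token row carries its surface form, UPOS tag, head index, and dependency relation.
        print(token["id"], token["form"], token["upos"], token["head"], token["deprel"])
```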